WO2011102142A1 - Welding apparatus and welding method
- Publication number
- WO2011102142A1 (PCT/JP2011/000922, JP2011000922W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- welding
- data
- shape
- posture
- shape data
Classifications
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B19/00—Programme-control systems
- G05B19/02—Programme-control systems electric
- G05B19/18—Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form
- G05B19/408—Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form characterised by data handling or data format, e.g. reading, buffering or conversion of data
- G05B19/4086—Coordinate conversions; Other special calculations
- B—PERFORMING OPERATIONS; TRANSPORTING
- B23—MACHINE TOOLS; METAL-WORKING NOT OTHERWISE PROVIDED FOR
- B23K—SOLDERING OR UNSOLDERING; WELDING; CLADDING OR PLATING BY SOLDERING OR WELDING; CUTTING BY APPLYING HEAT LOCALLY, e.g. FLAME CUTTING; WORKING BY LASER BEAM
- B23K37/00—Auxiliary devices or processes, not specially adapted to a procedure covered by only one of the preceding main groups
- B—PERFORMING OPERATIONS; TRANSPORTING
- B23—MACHINE TOOLS; METAL-WORKING NOT OTHERWISE PROVIDED FOR
- B23K—SOLDERING OR UNSOLDERING; WELDING; CLADDING OR PLATING BY SOLDERING OR WELDING; CUTTING BY APPLYING HEAT LOCALLY, e.g. FLAME CUTTING; WORKING BY LASER BEAM
- B23K9/00—Arc welding or cutting
- B23K9/235—Preliminary treatment
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/33—Director till display
- G05B2219/33259—Conversion of measuring robot coordinates to workpiece coordinates
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/37—Measurements
- G05B2219/37116—Shape sensor leads tool, in front of tool
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/45—Nc applications
- G05B2219/45104—Lasrobot, welding robot
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/50—Machine tool, machine tool null till machine tool work handling
- G05B2219/50353—Tool, probe inclination, orientation to surface, posture, attitude
Description
- Embodiments of the present invention relate to a welding apparatus and a welding method using a welding robot.
- the welding apparatus includes a rail installed on the object to be welded along a welding line, an articulated robot traveling on the rail, and a sensor for measuring a weld bead shape.
- the welding target position is corrected based on the weld bead shape measured by this sensor.
- a method has been proposed in which the welding speed, the target position, and the torch posture are corrected by measuring the groove and the shape of the weld bead (see Patent Document 2).
- An object of the present invention is to provide a welding apparatus and a welding method which do not require a rail for moving a welding robot and enable high quality automatic welding.
- The welding apparatus includes: a welding torch and a shape sensor attached to a welding robot; a shape data extraction unit that extracts shape data representing the contour of the welding object from the measurement data of the shape sensor; a conversion data calculation unit that calculates coordinate conversion data for correcting the shape data based on the position and posture of the shape sensor; a shape data correction unit that corrects the shape data based on the coordinate conversion data; an angle calculation unit that calculates the inclination angle of the groove to be welded based on the corrected shape data; and a welding position/posture determination unit that determines the position and posture of the welding torch based on the inclination angle of the groove.
- The welding method includes the steps of: controlling the position and posture of the shape sensor with respect to the welding object based on position/posture data; extracting shape data representing the contour of the welding object from the measurement data of the shape sensor whose position and posture have been controlled based on the position/posture data; calculating coordinate conversion data for correcting the shape data based on the position/posture data; correcting the shape data using the coordinate conversion data; extracting a plurality of shape change points from the corrected shape data; extracting, from the corrected shape data, the shape change points corresponding to the ends of the bead; calculating the inclination angle of the groove surface based on the corrected shape data and the plurality of shape change points; calculating the welding conditions and the position and posture of the welding torch based on the width of the bead and the inclination angle of the groove surface; and welding based on the welding conditions and the position and posture of the welding torch.
- high-quality automatic welding can be performed without the need for a rail for moving the welding robot.
- the welding apparatus includes a slider device 1, a shape sensor processing device 6 that receives data from the slider device 1, and a robot control device 5 that transmits and receives data to and from the shape sensor processing device 6.
- the robot control device 5 has a teaching data storage device 14 and an operation axis control device 15.
- The teaching data storage device 14 transmits measurement teaching data to the shape sensor processing device 6.
- the movement axis control device 15 controls the movement of the slider device 1 and the welding robot 2 described later.
- the slider device 1 has a pedestal B1, posts B2, B3 and B4, and a base 7.
- the column B2 is rotatable with respect to the pedestal B1 as shown by the arrow A, with the vertical axis as a pivot.
- the column B3 is movable (linearly movable) in the vertical direction (arrow B) with respect to the column B2.
- the column B4 can move (linearly move) in the front-rear direction (arrow C) with respect to the column B3.
- The base 7 is attached to the front of the column B4. That is, the base 7 can rotate about the vertical axis and can move in the vertical and front-rear directions.
- the welding robot 2 is installed on the base 7.
- The welding robot 2 has an arm with multiple joints, each providing an axis of rotation; for example, rotation about six axes at six joints is possible.
- first to sixth links (sub-arms) are arranged at the first to sixth joints, respectively. That is, the first joint, the first link, the second joint, the second link,..., The sixth joint, and the sixth link are arranged in order.
- a first joint is placed on the base 7.
- The j-th joint is placed at the tip of the (j−1)-th link (2 ≤ j ≤ 6).
- the tip of the sixth link corresponds to the tip of the arm.
- The welding torch 3 and the shape sensor 4 are attached to the tip of this arm in a fixed mutual relationship (for example, the relative position (distance) between the welding torch 3 and the shape sensor 4 is fixed). Measurement data is output from the shape sensor 4 to the shape sensor processing device 6.
- the shape sensor 4 can be configured by a combination of an irradiation device and an imaging device.
- the operation of the welding apparatus will be described with reference to FIG.
- the operation of the welding apparatus is divided into processing steps (steps S1 and S2) in the robot control device 5 and processing steps (steps S3 to S9) in the shape sensor processing device 6.
- teaching data is stored in the teaching data storage device 14.
- the welding torch 3 or the shape sensor 4 is moved to the teaching point by using the operation device provided in the robot control device 5, and the function of storing the position and the posture is selected.
- the teaching data is input and stored in the teaching data storage device 14.
- the teaching data is composed of operation commands including position / attitude data representing the position / attitude of the welding torch 3 and the shape sensor 4 attached to the tip of the arm of the welding robot 2 and welding conditions.
- the teaching data can be divided into welding teaching data used for welding with the welding torch 3 and measurement teaching data used for measurement with the shape sensor 4.
- The position/posture data corresponds to the planned welding line (a line segment representing the axis of the bead to be formed on the welding object). That is, the position of the welding torch 3 (more precisely, the point welded by the welding torch 3) lies on the welding line. In general, it is also preferable to orient the welding torch 3 along the bisector of the inclination angle of the pair of groove surfaces, on the plane perpendicular to the welding line (the normal position and posture). Position/posture data is usually set for such a position and posture.
- However, the position and posture of the welding torch 3 are changed where necessary so that the welding apparatus and the welding object do not interfere (contact) with each other.
- the position and orientation of the shape sensor 4 correspond to the planned welding line. It is assumed that the shape sensor 4 is configured by a combination of an irradiation device and an imaging device. In this case, light is preferably emitted from the irradiation device along the vertical surface of the weld line. For example, it is preferable that the irradiation surface S0 described later coincides with the vertical surface S1 of the welding line. However, the position and orientation of the shape sensor 4 are appropriately changed so that the welding device and the welding object do not interfere with each other.
- teaching data is input such that the welding device and the object to be welded do not interfere with each other.
- Different position and posture data can be used for each of the welding with the welding torch 3 and the measurement with the shape sensor 4 (at least one of the position and the posture of the welding torch 3 and the shape sensor 4 is different). However, the positions and postures of the welding torch 3 and the shape sensor 4 may be matched.
- the position / posture data can be divided into reference coordinates representing the mounting position (base 7) of the welding robot 2 and relative coordinates (robot coordinate system) representing the relative displacement of the welding torch 3 or the like from the reference coordinates.
- the reference coordinates and the relative coordinates are used to control the operations of the slider device 1 and the welding robot 2, respectively.
- In step S2, the movement axes of the slider device 1 and the welding robot 2 are controlled based on the teaching data (measurement teaching data) stored in the teaching data storage step (step S1).
- the operations of the slider device 1 and the welding robot 2 are controlled by the reference coordinates and relative coordinates in the teaching data. That is, the position and orientation of the shape sensor 4 are controlled based on the position and orientation data. After this control, measurement by the shape sensor 4 is performed.
- In step S3, coordinate conversion data for correcting the shape data (the conversion matrices Cn, Cn′, and Cn″ described later) is calculated based on the teaching data (measurement teaching data) output from the robot control device 5.
- The coordinate conversion data calculated in this step is used in steps S5, S8, and S9. The details of the calculation are described later.
- In step S4, shape data representing the contour of the welding object is extracted, by noise removal and binarization, from the measurement data output from the shape sensor 4 in the operation axis control step (step S2). As described later, the shape data is distorted depending on the position and posture of the shape sensor 4 relative to the welding object.
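As an illustration, the noise removal and binarization of this step might look like the following sketch. The box-blur denoising, the threshold value, and the per-column centroid are assumptions for illustration only; the patent does not specify the actual image processing.

```python
import numpy as np

def extract_contour(image, threshold=128):
    """Binarize a grayscale slit-light image and, per column, keep the
    centroid row of the bright pixels as one contour sample (step S4
    sketch; threshold and filtering are illustrative assumptions)."""
    # Simple noise removal: 3x3 box blur before thresholding.
    padded = np.pad(image.astype(float), 1, mode="edge")
    blurred = sum(padded[i:i + image.shape[0], j:j + image.shape[1]]
                  for i in range(3) for j in range(3)) / 9.0
    binary = blurred > threshold          # binarization
    contour = []
    for col in range(binary.shape[1]):
        rows = np.flatnonzero(binary[:, col])
        if rows.size:                     # column hit by the slit light
            contour.append((col, rows.mean()))
    return np.array(contour)
```

The centroid per column collapses the finite width of the laser line to a single sub-pixel sample, a common simplification for slit-light sensors.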
- the shape data extracted in the shape data extraction step (step S4) is corrected using the coordinate conversion data calculated in the coordinate conversion data calculation step (step S3). That is, distortion of shape data is reduced.
- the change point of the shape is extracted from the shape data corrected in the sensor posture correction step (step S5).
- This change point corresponds to, for example, the boundary between the top surface of the object to be welded and the groove surface, and the boundary between the weld bead and the groove surface (the end of the weld bead). That is, at the boundary of a plurality of surfaces (for example, the upper surface, the groove surface, and the bead surface), the angle of the contour changes rapidly. For this reason, a change point is extracted as a point where the absolute value of the local gradient (differential amount) in the contour represented by the shape data is large. The details will be described later.
- the bead end is extracted from the change point extracted in the change point extraction step (step S6). Furthermore, the bead surface and the beveled surface are identified. As described above, the change point includes the end of the weld bead (the boundary between the weld bead and the groove surface). As shown in the third embodiment, it is possible to compare shape data before and after the previous welding (welding in the lower layer of the present weld layer), and extract points having a large amount of change in shape as bead ends. The shape data between the two ends of the weld bead correspond to the bead surface. Further, shape data on both sides of the two end portions correspond to a pair of groove surfaces. The details will be described later.
- The welding conditions, the welding target position (the position of the welding torch 3), and the welding torch posture are determined from the width of the bead identified in the groove/bead surface extraction step (step S7) and the inclination angle of the pair of groove surfaces.
- the width of the bead corresponds to the distance between the two ends of the weld bead.
- The welding aim position and welding torch posture obtained in the welding condition calculation step (step S8) are converted into a position and posture in the robot coordinate system.
- The calculated position data is stored in the teaching data storage device 14 as teaching data (welding teaching data). Welding is then performed by the welding apparatus based on this welding teaching data.
- As shown in FIG. 3, the water turbine runner 16 is lifted in a standing state by a crane (not shown) and installed on the turning roller 17.
- a welding device constituted by the slider device 1 and the welding robot 2 is installed beside the opening 51 of the water turbine runner 16.
- The water turbine runner 16 is rotated in conjunction with the turning roller 17.
- the rotation of the water turbine runner 16 is stopped at an angle at which the groove portion (portion corresponding to the groove surface) 53 (see FIG. 6) of the blade 18 (member to be welded) is positioned in front of the welding device.
- the slider device 1 and the welding robot 2 are operated by the motion axis control device 15 provided in the robot control device 5.
- the shape of the groove 53 is measured by the shape sensor 4 attached to the tip of the arm of the welding robot 2.
- The blades 18, the crown 19, and the band 20, which are members of the water turbine runner 16, each have a three-dimensional curved surface.
- FIGS. 4 and 5 show an example of the inclination angle of the groove of the blade 18 and the inclination of the weld line.
- the vertical axis in FIG. 4 indicates the inclination angle of the groove.
- The horizontal axis in FIG. 4 indicates the distance in the direction from the inlet (located on the outer peripheral side of the water turbine runner 16, where water is taken in) toward the outlet (located in the middle of the water turbine runner 16, where water is released).
- the vertical axis in FIG. 5 indicates the height from the reference point.
- the horizontal axis in FIG. 5 indicates the distance in the direction from the inlet to the outlet. It can be seen that the inclination angle and the slope of the weld line are continuously changing.
- the shape sensor 4 is composed of a laser slit light irradiator 21 as an irradiation device and a CCD camera 22 as an imaging device.
- the laser slit light irradiator 21 emits a slit-like laser light (slit light).
- The slit light traveling toward the blade 18 to be welded illuminates the linear portion (irradiation line) LR where the blade intersects the irradiation surface (the plane formed by the slit light) S0.
- the irradiation line LR has a shape corresponding to the contour of the object to be welded.
- the image of the irradiation line LR is captured by the CCD camera 22 as measurement data.
- the shape of the slit light (the shape of the irradiation line LR) is extracted as shape data from the measurement data.
- Shape data is generated as follows. First, the pixels of slit light that hit the welding object (the irradiation line LR) are extracted from the image (measurement data) obtained by the CCD camera 22. Then, based on the relative position and posture (direction) of the laser slit light irradiator 21 and the CCD camera 22, the position of each extracted pixel is converted to a position on the slit light plane (irradiation surface S0) emitted from the irradiator 21. As a result, shape data is generated.
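Because every lit pixel corresponds to a point on the known light plane S0, the pixel-to-plane conversion can be modeled as a planar homography. The following sketch rests on that assumption; the patent only states that the conversion uses the known relative position and posture of the irradiator 21 and the CCD camera 22, and the matrix H here stands for a hypothetical calibration result.

```python
import numpy as np

def pixels_to_plane(pixels, H):
    """Map extracted slit-light pixels (u, v) to 2-D positions on the
    irradiation surface S0 through a 3x3 planar homography H obtained
    by calibrating the irradiator/camera geometry (hypothetical)."""
    uv1 = np.column_stack([np.asarray(pixels, float),
                           np.ones(len(pixels))])   # homogeneous pixel coords
    xyw = uv1 @ H.T                                  # apply the homography
    return xyw[:, :2] / xyw[:, 2:3]                  # back to inhomogeneous
```

A homography suffices here only because all measured points lie on one plane; a general stereo triangulation would otherwise be required.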
- It is preferable that the irradiation surface S0 be perpendicular to the planned welding line. That is, the welding robot 2 is controlled so that the irradiation surface S0 is perpendicular to the welding line, and the position and posture of the shape sensor 4 are determined accordingly.
- However, the position and posture of the shape sensor 4 may be restricted, and the irradiation surface S0 may not be perpendicular to the weld line. That is, depending on the position and posture of the shape sensor 4, the members of the water turbine runner may interfere (contact) with the welding torch 3 and the shape sensor 4. In this case, the posture of the shape sensor 4 must be changed to avoid the interference.
- The shape data extracted in the shape data extraction step therefore includes distortion due to the posture change. For this reason, correction of the shape data is required.
- the distortion due to the posture change means a deviation from the shape data in the case where the irradiation surface S0 of the slit light is perpendicular to the welding line.
- distortion generally occurs and correction is required.
- The position and direction of the irradiation surface S0 are necessary for correcting the posture change. For this reason, measurement teaching data (position/posture data) is sent from the teaching data storage device 14 to the shape sensor processing device 6.
- The correction of the posture change will be described with reference to FIG. Consider correcting the shape data corresponding to the measurement teaching point Pn (here, a point on the planned welding line).
- In the coordinate conversion data calculation step (step S3), an arc AC is assumed that passes through three measurement teaching points: the teaching point Pn and the teaching points Pn−1 and Pn+1 before and after it.
- a tangent vector Xn ' which is in contact with the arc AC at the teaching point Pn is determined.
- the tangent vector Xn ' represents the weld line direction at the teaching point Pn.
- a vertical plane S1 of the weld line including the teaching point Pn and having the vector Xn 'as a normal vector is determined.
- a vector Zn representing the axial direction of the laser slit light irradiator 21 of the shape sensor 4 is determined.
- a vector Zn ' is obtained by projecting the vector Zn on the vertical plane.
- Let N be the unit vector of the tangent vector Xn′, and let Pn be the projection matrix onto the plane S1 perpendicular to N.
- Pn = I − N · N^T
- where I is the unit matrix and N^T is the transpose of the unit vector N.
- Zn′ can then be calculated by the following equation.
- Zn′ = Pn · Zn
- A vector Yn′ orthogonal to the vectors Xn′ and Zn′ obtained above is determined.
- Yn′ = Zn′ × Xn′ (where "×" denotes the cross product of vectors)
- A matrix (conversion matrix) Cn′ is calculated that represents the coordinate system of the vertical plane S1 (as viewed from the robot coordinate system), with the vectors Xn′, Yn′, and Zn′ as coordinate axes and the teaching point Pn as the coordinate origin.
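The construction of Cn′ from the teaching points can be sketched as follows. Note that the weld-line tangent is approximated here by the chord through Pn−1 and Pn+1, which is an assumption; it matches the exact tangent of the arc AC only when Pn is the mid-point of the arc.

```python
import numpy as np

def vertical_plane_frame(p_prev, p_n, p_next, z_sensor):
    """Build the conversion matrix Cn' (a homogeneous 4x4 frame) for the
    teaching point Pn.  The tangent Xn' is approximated by the chord
    through Pn-1 and Pn+1 (an assumption standing in for the exact arc
    tangent), Zn' projects the sensor axis Zn onto the vertical plane
    S1, and Yn' completes a right-handed frame."""
    p_prev, p_n, p_next = (np.asarray(p, float) for p in (p_prev, p_n, p_next))
    xn = p_next - p_prev
    xn /= np.linalg.norm(xn)                   # unit tangent N (weld-line direction)
    P = np.eye(3) - np.outer(xn, xn)           # projection matrix: P = I - N N^T
    zn = P @ np.asarray(z_sensor, float)       # Zn' = P . Zn
    zn /= np.linalg.norm(zn)
    yn = np.cross(zn, xn)                      # Yn' = Zn' x Xn'
    C = np.eye(4)
    C[:3, 0], C[:3, 1], C[:3, 2] = xn, yn, zn  # coordinate axes as columns
    C[:3, 3] = p_n                             # coordinate origin at Pn
    return C
```

The resulting rotation part is orthonormal with determinant +1, so Cn′ is a valid rigid-body frame.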
- the shape sensor 4 (and the welding torch 3) is attached to the arm of the welding robot 2 having, for example, six joints. Therefore, the position and orientation (direction) of the shape sensor 4 are determined in accordance with the operation of the six joints.
- The relative positions and postures of the tips of the first to sixth links connected to the first to sixth joints can be represented by matrices Ai. That is, the matrix A1 represents the position and orientation of the tip of the first link relative to the robot coordinate system, and the matrix Ai (i ≥ 2) represents the position and orientation of the tip of the i-th link relative to the tip of the (i−1)-th link.
- The matrix T6 representing the position and orientation of the arm tip (the shape sensor 4) of the welding robot 2 (the position and direction of the irradiation surface S0, i.e. the position and orientation of the teaching point of the measurement teaching data) can be expressed as the following product of the matrices A1 to A6.
- T6 = A1 · A2 · A3 · A4 · A5 · A6 ... Equation (1)
- the matrix A i may include both translational and rotational components.
- the translational component represents a component of coordinate transformation by translational movement of the tip of the i-th link with respect to the tip of the (i-1) th link.
- the rotational component represents a component of coordinate transformation due to rotational movement of the tip of the i-th link with respect to the tip of the (i-1) th link.
- The translational component corresponds to the position of the teaching point Pn. It can be obtained by solving the kinematic equations for the joint axis angles at which the measurement teaching data was stored.
- the translational component is calculated from the equation (1) corresponding to the teaching data of the teaching point Pn.
- N = [Nx, Ny, Nz, 0]^T
- O = [Ox, Oy, Oz, 0]^T
- A = [Ax, Ay, Az, 0]^T
- where N, O, and A are the unit vectors of the vectors Xn′, Yn′, and Zn′, respectively.
- The rotation around the Z axis is denoted θr, the rotation around the Y axis θp, and the rotation around the X axis θy (roll-pitch-yaw rotation).
- A matrix (conversion matrix) Cn representing the coordinate system of the irradiation surface S0 (viewed from the robot coordinate system), with the vectors Xn, Yn, and Zn as coordinate axes and the teaching point Pn as the coordinate origin, is expressed by Equation (2), which has the same form as Equation (1).
- Cn = A1 · A2 · A3 · A4 · A5 · A6 ... Equation (2). However, the content of the matrices Ai in Equation (1) does not always match that in Equation (2) (the arm state of the welding robot 2 differs between them).
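The products in Equations (1) and (2) can be computed by chaining homogeneous link matrices. The Denavit-Hartenberg parameterization used below to give each Ai a concrete shape is an assumption; the patent only requires that each Ai combine the translational and rotational components of one link.

```python
import numpy as np

def link_transform(theta, d, a, alpha):
    """One homogeneous link matrix Ai in Denavit-Hartenberg form (an
    assumed convention; the patent does not name one)."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([[ct, -st * ca,  st * sa, a * ct],
                     [st,  ct * ca, -ct * sa, a * st],
                     [0.0,      sa,       ca,      d],
                     [0.0,     0.0,      0.0,    1.0]])

def arm_tip_pose(dh_rows):
    """T6 = A1 . A2 . ... . A6 (Equation (1)): chain the link matrices
    to obtain the arm-tip pose in the robot coordinate system."""
    T = np.eye(4)
    for row in dh_rows:                 # one (theta, d, a, alpha) per joint
        T = T @ link_transform(*row)
    return T
```

The same chaining, evaluated at a different arm state, yields the matrix Cn of Equation (2).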
- the transformation matrices Cn 'and Cn calculated above mean coordinate transformation data.
- the coordinate conversion data is used in the sensor attitude correction process (step S5), the welding condition calculation process (step S8), and the welding position / attitude calculation process (step S9).
- the shape data extracted in the shape data extraction step is corrected. That is, the position matrix Tn 'corresponding to the shape data after correction is calculated from the position matrix Tn corresponding to the point on the shape data (point on the irradiation line LR).
- Tn′ = Cn′⁻¹ · Cn · Tn ... Equation (3)
- where Cn′⁻¹ represents the inverse of the matrix Cn′.
- FIG. 8 schematically shows the contents of this coordinate conversion.
- Position data (shape data) on the irradiation surface S0 is converted into shape data on the vertical surface S1 (coordinate conversion).
- the projection of the point Pa on the irradiation surface S0 onto the vertical plane S1 is taken as a point Pb.
- the vectors Va and Vb are respectively represented by coordinates (Xa, Ya, Za) on the irradiation surface S0 and coordinates (Xb ', Yb', Zb ') of the vertical surface S1.
- Vb = Cn′⁻¹ · Cn · Va
- the coordinate conversion by "Cn ' -1 ⁇ Cn" corresponds to the projection from the point on the irradiation surface S0 to the vertical surface S1.
- A position matrix Tn′ corresponding to each point of the corrected shape data is calculated by Equation (3). From the plurality of position matrices Tn corresponding to the points of the shape data (points (coordinates) on the irradiation line LR), the plurality of position matrices Tn′ corresponding to the points of the corrected shape data (points on the corrected irradiation line LR) are calculated.
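Applied to a whole scan, Equation (3) becomes one matrix product per point. A sketch follows; the frames are hypothetical 4x4 homogeneous matrices, e.g. as produced by Equations (1)-(2).

```python
import numpy as np

def correct_shape_points(points_s0, Cn, Cn_prime):
    """Equation (3): Tn' = Cn'^-1 . Cn . Tn for every measured point.
    points_s0 are 3-D points expressed in the irradiation-surface frame
    Cn; the result expresses them in the vertical-plane frame Cn'."""
    M = np.linalg.inv(Cn_prime) @ Cn                   # combined correction
    pts = np.column_stack([np.asarray(points_s0, float),
                           np.ones(len(points_s0))])   # homogeneous points
    return (pts @ M.T)[:, :3]
```

Precomputing M once per teaching point avoids inverting Cn′ for each of the (typically many) contour samples.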
- The corrected shape data (position matrices Tn′) is used in the change point extraction step (step S6). That is, as shown in FIG. 9, a point with a large angle change (angular difference) between the vectors connecting successive points of the shape data (points on the corrected irradiation line LR) is extracted as a change point.
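A minimal version of this angle-difference test might look as follows; the 20-degree threshold is an illustrative assumption, as the patent does not give a value.

```python
import numpy as np

def change_points(contour, angle_threshold_deg=20.0):
    """Flag contour points where the direction of consecutive segment
    vectors turns by more than the threshold (step S6 sketch)."""
    contour = np.asarray(contour, float)
    v = np.diff(contour, axis=0)                        # segment vectors
    v = v / np.linalg.norm(v, axis=1, keepdims=True)
    cos_turn = np.clip((v[:-1] * v[1:]).sum(axis=1), -1.0, 1.0)
    turn = np.degrees(np.arccos(cos_turn))              # angle change per point
    return np.flatnonzero(turn > angle_threshold_deg) + 1  # indices into contour
```

On real data the contour would usually be smoothed first, since the angle difference amplifies measurement noise.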
- In the groove/bead surface extraction step (S7), the following processing is performed. First, two points corresponding to the opening ends of the groove are extracted from the extracted change points. The points between these two points are identified as the bead ends. From the positions of the opening end and the bead end on the crown 19 side or the band 20 side, the angle of the groove surface on the weld-line vertical plane S1 is calculated. Further, the bead width is calculated from the distance between the bead ends.
- In step S8, the following processing is performed.
- The inclination angle of the groove surface in the robot coordinate system is calculated from the angle of the groove surface on the weld-line vertical plane S1 calculated in the groove/bead surface extraction step (step S7) and the conversion matrix Cn′, calculated in the coordinate conversion data calculation step (step S3), representing the position and posture of the vertical plane S1.
- The optimal values of the welding conditions, the aim position on the weld-line vertical plane, and the torch posture are then determined. The welding conditions include the welding current, welding voltage, welding speed, and the weaving frequency, amplitude, and direction.
- A conditional branch expression is stored in the shape sensor processing device 6.
- FIGS. 10 and 11 show an example of the conditional branch expressions.
- FIG. 10 shows the combinations of conditions selected according to the bead width.
- FIG. 11 shows the welding conditions. Multilayer welding is considered here.
- If the bead width (the width of the existing (lower-layer) bead) is equal to or less than a first value (12 mm), condition 3 is selected: welding is performed at the center of the existing bead, and an upper-layer bead is formed. If the bead width is greater than the first value (12 mm) and not more than a second value (19 mm), conditions 1 and 3 are selected in turn, and welding is performed at two places, right and left. If the bead width is larger than the second value (19 mm), conditions 1, 2, and 3 are selected in order, and welding is performed at three places: right, center, and left.
- "End of bead" in condition 2 of FIG. 11 means the end of the bead in the layer below the layer to be welded.
- an aim position, a torch attitude, and welding conditions are set.
- the aim position, the torch attitude, and the welding conditions that fit the bead width calculated in the groove / bead surface extraction step (step S7) are obtained.
- Conditional branch formulas such as those shown in FIGS. 10 and 11 are set for each gradient of the weld line. In this way, once the bead width and the gradient of the weld line are determined, the aim position, the torch posture, and the welding conditions can be determined.
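The bead-width branch described above reduces to a small lookup. The thresholds (12 mm, 19 mm) and condition numbers come from the example in the text, while the function name and the list encoding of the pass sequence are illustrative.

```python
def select_conditions(bead_width_mm):
    """Return the welding conditions to apply, in order, for a measured
    lower-layer bead width: one pass at the center, two passes (right
    and left), or three passes (right, center, left)."""
    if bead_width_mm <= 12.0:          # first threshold value
        return [3]                     # single pass at the bead center
    if bead_width_mm <= 19.0:          # second threshold value
        return [1, 3]                  # right and left passes
    return [1, 2, 3]                   # right, center and left passes
```

In practice one such table would be stored per weld-line gradient, as the text notes.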
- the inclination angle of the groove is used as a reference of the torch posture. That is, the torch posture is determined with the beveled surface as a reference surface.
- The above-described aim position, torch posture, and welding conditions are generally set as values on the groove vertical plane S1. For this reason, Equation (3) above converts the coordinates of the shape data into coordinates on the vertical plane S1.
- A conversion matrix Cn″ representing the position and posture of the teaching point (welding torch 3) of the welding teaching data is determined. Since it corresponds to the position and posture of the arm tip, it can be expressed by the following Equation (4), similarly to Equation (2).
- Cn″ = A1 · A2 · A3 · A4 · A5 · A6 ... Equation (4)
- However, the content of the matrices Ai in Equations (1) and (2) does not necessarily coincide with that in Equation (4) (the arm state of the welding robot 2 differs).
- The aim position and torch posture on the weld-line vertical plane S1 are converted into the aim position and torch posture in the robot coordinate system using the calculated conversion matrix Cn″. In this way, a matrix Xd′ expressed in the robot coordinate system is calculated from the matrix Xd representing the position and posture on the weld-line vertical plane S1.
- In the robot control device 5, the calculated welding position/posture and welding conditions are stored in the teaching data storage device 14.
- the welding operation can be taught and welding teaching data can be generated.
- the welding operation is performed by automatically reproducing the teaching data.
- the use of the welding device constituted by the slider device 1, the welding robot 2, the welding torch 3, and the shape sensor 4 eliminates the need for a rail for moving the welding robot.
- With the robot control device 5 and the shape sensor processing device 6, the degree of freedom of the posture of the sensor measuring the weld bead shape is improved. As a result, an automatic welding apparatus and welding method for large complex structures capable of high-quality automatic welding can be provided.
- In addition, a three-dimensional CAD 23 for product design and an off-line teaching system 24 are provided.
- The welding apparatus includes the slider device 1, a shape sensor processing device 26 that receives data from the slider device 1, and the robot control device 5, which exchanges data with the shape sensor processing device 26.
- The robot control device 5 has the teaching data storage device 14 and the motion axis control device 15.
- The teaching data storage device 14 transmits measurement teaching data to the shape sensor processing device 26.
- The motion axis control device 15 controls the movement of the slider device 1 and the welding robot 2.
- The data output from the shape sensor 4 is sent to the shape sensor processing device 26.
- The shape sensor processing device 26 carries out the process shown in FIG.
- Teaching data is stored in the teaching data storage device 14.
- The teaching data is input using an input device such as a keyboard.
- In step S42, the movement axes of the slider device 1 and the welding robot 2 are controlled based on the teaching data (measurement teaching data) stored in the teaching data storage step (step S41).
- The motion axis control device 15 drives the slider device 1 and the welding robot 2 to move the shape sensor to the measurement teaching point, and measurement data from the shape sensor 4 is then acquired.
- Next, a measurement/welding teaching process (step S43), a posture changing process (step S44), and a conversion data calculation process (step S45) are performed.
- In step S43, teaching data is created on a computer.
- In step S44, interference between the construction object, such as a large water-turbine runner, and the welding device is checked. If they interfere, the postures of the welding torch 3 and the shape sensor 4 are changed.
- In step S45, conversion data between the posture before the change and the posture after the change is calculated.
- Then, a shape data extraction step (step S46), a sensor posture correction step (step S47), a change point extraction step (step S48), a groove/bead surface extraction step (step S49), a welding condition calculation step (step S50), and a welding position/posture correction step (step S51) are performed.
- The welding position/posture correction process (step S51) is provided in place of the welding position/posture calculation process of the first embodiment.
- Steps S46 to S51 correspond to steps S4 to S8 of the first embodiment and represent the same processing, respectively. Step S51 will be described later.
- Three-dimensional shape data of the construction object, such as a water-turbine runner, is created using the three-dimensional CAD 23 for product design and is input to the off-line teaching system 24, that is, a digitizing device.
- In the first embodiment, a person operates the welding robot 2 to input data; that is, interference between the welding object and the welding apparatus is avoided manually.
- Here, in contrast, the digitized data is input to the welding robot 2 using the three-dimensional CAD 23 for product design and the off-line teaching system 24.
- Interference is then avoided automatically using three-dimensional data of the welding object and the welding apparatus.
- The three-dimensional shape data of the construction object, generated in advance, is placed together with three-dimensional shape models of the welding device (the slider device 1, the welding robot 2, the welding torch 3, and the shape sensor 4) in a virtual space on a computer. Teaching data is then calculated such that the shape sensor (and the welding torch) is placed at a position and in a direction (posture) that lies on the vertical plane S1 of the weld line of the groove of the construction object represented by the three-dimensional shape data and passes through the center of the angle between the groove surfaces.
- Measurement teaching data and welding teaching data are created by adding an operation command to each teaching point as described above.
- In step S44, the presence or absence of interference between the construction object and the welding apparatus is checked using the measurement and welding teaching data. If there is interference, the postures of the welding torch 3 and the shape sensor 4 included in the teaching data are changed. The welding teaching data whose posture has been changed is used in the welding position/posture correction step (step S51), and the measurement teaching data is output to the teaching data storage device 14 and the conversion data calculation function 27.
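The interference check in the virtual space can be sketched very simply with axis-aligned bounding boxes. The patent does not specify the collision test actually used; the box-overlap test and the vertex clouds below are purely hypothetical illustrations of the idea:

```python
import numpy as np

def aabb(points):
    """Axis-aligned bounding box (min corner, max corner) of a point set."""
    pts = np.asarray(points, dtype=float)
    return pts.min(axis=0), pts.max(axis=0)

def interferes(box_a, box_b):
    """True when two axis-aligned boxes overlap on every axis."""
    (amin, amax), (bmin, bmax) = box_a, box_b
    return bool(np.all(amax >= bmin) and np.all(bmax >= amin))

# Hypothetical vertex clouds: the runner surface and the torch/sensor head.
runner = aabb([[0, 0, 0], [1, 2, 1]])
torch_clear = aabb([[1.5, 0, 0], [2.0, 1, 1]])   # posture away from the work
torch_hit   = aabb([[0.8, 0, 0], [1.2, 1, 1]])   # posture intruding into it

print(interferes(runner, torch_clear))  # False: posture is kept
print(interferes(runner, torch_hit))    # True: posture must be changed
```

A production system would test the full triangle meshes, but the decision logic (keep the posture if no overlap, otherwise change it) is the same.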
- In step S45, rotation conversion data that converts from the posture after the change back to the posture before the change is calculated, using the attitude data of the measurement teaching point in the measurement teaching data after the posture change (obtained in the posture changing process, step S44) and before the posture change (obtained in the measurement/welding teaching process, step S43).
- In the sensor posture correction step (step S47), the shape data after distortion correction is calculated by converting the shape data obtained in the shape data extraction step (step S46) based on this rotation conversion data.
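This rotation-conversion step can be sketched in numpy. The sensor attitudes and the measured profile below are placeholder values, and representing the conversion data as "before-rotation times transposed after-rotation" is an assumed formulation, not one stated in the patent:

```python
import numpy as np

def rot_z(a):
    """Rotation matrix about the z axis by angle a (radians)."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.], [s, c, 0.], [0., 0., 1.]])

# Sensor attitudes (placeholders): before and after the posture change.
R_before = rot_z(0.0)
R_after = rot_z(np.deg2rad(30))

# Rotation conversion data: maps measurements taken in the changed posture
# back to the posture assumed when the teaching data was first generated.
R_conv = R_before @ R_after.T

# Shape data measured by the tilted sensor (hypothetical 3-point profile).
shape_after = np.array([[0.0, 0.0, 0.0],
                        [10., 0.0, 2.0],
                        [20., 0.0, 0.0]])

# Distortion-corrected shape data: apply R_conv to every measured point.
shape_corrected = shape_after @ R_conv.T
```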
- In step S51, the position/attitude data of the welding teaching point in the welding teaching data calculated in the posture changing process (step S44) is corrected, using the welding conditions, the aiming position on the weld-line vertical plane, and the torch attitude calculated in the welding condition calculation process (step S50), together with the rotation conversion data calculated in the conversion data calculation process (step S45).
- The corrected position/posture data is stored in the teaching data storage device 14 as welding teaching data.
- The motion axis control device 15 drives the slider device 1 and the welding robot 2 based on the welding teaching data stored in the teaching data storage device 14 to perform automatic welding.
- The use of a welding device constituted by the slider device 1, the welding robot 2, the welding torch 3, and the shape sensor 4 eliminates the need for a rail for moving the welding robot.
- By means of the robot control device 5, the shape sensor processing device 26, the three-dimensional CAD 23 for product design, and the off-line teaching system 24, the degree of freedom of the attitude of the sensor that measures the weld bead shape is improved. As a result, an automatic welding apparatus and welding method for large complex structures capable of high-quality automatic welding can be provided.
- Graphs 1, 1-1, 1-2, 2, and 3 show, respectively, the shape data, the previous shape data (the shape measured at the previous welding), the current shape data (the shape measured at the current welding), the angle change amount of the vectors of the current shape data, and the difference between the previous shape data and the current shape data.
- The shape data (graph 1) includes the previous shape data (graph 1-1) and the current shape data (graph 1-2).
- In the change point extraction step (steps S6 and S48), the four points A, B, C, and D at which the angle change amount of the shape data vectors is large are extracted in descending order of the change amount.
- For the portion that forms the crown-side or band-side groove surface, the end E of the shape data is extracted as the groove surface end. Furthermore, the difference between the previous shape data and the current shape data is calculated.
- The points B and D of the current shape data, corresponding to the points b and d where the change in this difference is large, are extracted as bead end portions.
- Welding conditions, a target position, and a torch posture are selected so that the weld bead does not take an overlap shape, which causes penetration defects.
- In some welding postures, the lower-end shape of the weld bead tends to be smooth compared with other welding postures.
- In such cases, a phenomenon occurs in which the bead end cannot be extracted from the shape change points alone.
- Therefore, the difference between the previous shape data and the current shape data is obtained, and points with a large amount of change are extracted as the bead ends.
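Both extraction rules, the angle-change points of the current profile (steps S6/S48) and the fallback based on the difference from the previous profile, can be sketched on a synthetic groove profile. The profile values below are invented for illustration; only the two selection rules follow the text:

```python
import numpy as np

def angle_change(profile):
    """Per-point change of direction angle of consecutive profile segments."""
    seg = np.diff(profile, axis=0)            # segment vectors between points
    ang = np.arctan2(seg[:, 1], seg[:, 0])    # direction angle of each segment
    return np.abs(np.diff(ang))               # angle change at interior points

# Hypothetical (y, z) profiles: previous pass and current pass.
y = np.linspace(0.0, 30.0, 31)
prev_z = np.where((y > 10) & (y < 20), 2.0, 0.0)   # old, narrower bead
curr_z = np.where((y > 8) & (y < 22), 3.0, 0.0)    # wider, taller new bead
curr = np.column_stack([y, curr_z])

# Rule 1: the four interior points with the largest angle change
# (candidates for the change points A, B, C, D).
changes = angle_change(curr)
corner_idx = np.sort(np.argsort(changes)[-4:]) + 1

# Rule 2 (smooth bead toes): take the difference between the previous and
# current profiles and pick the points where that difference changes most.
diff = curr_z - prev_z
bead_end_idx = np.sort(np.argsort(np.abs(np.diff(diff)))[-2:])
```

On this synthetic profile both rules locate the transitions at the edges of the current bead; on a real smooth-toed bead only the difference-based rule would succeed, which is exactly why the fallback exists.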
- The welding conditions, comprising the welding current, welding voltage, welding speed, and the weaving frequency, amplitude, and direction, are determined in accordance with the bead width obtained from the bead end positions, together with the aiming position and torch attitude on the measurement plane.
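One simple way to realize "conditions determined in accordance with the bead width" is a lookup keyed by width. The table below is entirely hypothetical (the patent gives no numeric values); it only illustrates the selection step:

```python
# Hypothetical welding-condition table keyed by measured bead width (mm).
# All numbers are placeholders, not values from the patent.
CONDITIONS = [
    (6.0,  {"current_a": 180, "voltage_v": 22, "speed_mm_s": 5.0, "weave_hz": 2.0}),
    (10.0, {"current_a": 220, "voltage_v": 25, "speed_mm_s": 4.0, "weave_hz": 1.5}),
    (14.0, {"current_a": 260, "voltage_v": 28, "speed_mm_s": 3.0, "weave_hz": 1.0}),
]

def select_conditions(bead_width_mm):
    """Pick the first condition row whose width bound covers the measured bead."""
    for max_width, cond in CONDITIONS:
        if bead_width_mm <= max_width:
            return cond
    return CONDITIONS[-1][1]  # wider than every bound: fall back to the last row

print(select_conditions(8.5)["current_a"])  # prints 220
```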
- In this way, the bead end positions can be reliably determined from the measured shape data, and based on them, high-quality automatic welding can be performed by determining the welding conditions, the target position, and the torch posture.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Mechanical Engineering (AREA)
- Optics & Photonics (AREA)
- General Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- Manufacturing & Machinery (AREA)
- Human Computer Interaction (AREA)
- Plasma & Fusion (AREA)
- Manipulator (AREA)
- Numerical Control (AREA)
- Arc Welding In General (AREA)
- Butt Welding And Welding Of Specific Article (AREA)
Abstract
Description
The welding device of the first embodiment will be described with reference to FIG. 1. This welding device includes a slider device 1, a shape sensor processing device 6 that receives data from the slider device 1, and a robot control device 5 that exchanges data with the shape sensor processing device 6. The robot control device 5 has a teaching data storage device 14 and a motion axis control device 15. The teaching data storage device 14 transmits measurement teaching data to the shape sensor processing device 6. The motion axis control device 15 controls the operation of the slider device 1 and of a welding robot 2 described later.
Pn = I − N·N^T

where I is the identity matrix and N^T is the transpose of the unit vector N.

From the above, Zn′ can be calculated as:

Zn′ = Pn·Zn

Yn′ = Zn′ × Xn′ (where "×" denotes the vector cross product)

Taking these vectors Xn′, Yn′, and Zn′ as coordinate axes and the teaching point Pn as the coordinate origin, a matrix (conversion matrix) Cn′ representing the coordinate system of the vertical plane S1 (as seen from the robot coordinate system) is calculated.
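The projection and cross-product steps that build the plane-S1 frame can be sketched in numpy. The plane normal and the choice of Xn′ as the weld-line direction are illustrative assumptions:

```python
import numpy as np

# Unit normal N of the weld-line vertical plane S1 (placeholder direction).
N = np.array([1.0, 1.0, 0.0])
N /= np.linalg.norm(N)

# Pn = I - N N^T projects any vector onto the plane orthogonal to N.
Pn = np.eye(3) - np.outer(N, N)

Zn = np.array([0.0, 0.0, 1.0])   # axis direction to be projected into the plane
Zn_p = Pn @ Zn                   # Zn' = Pn . Zn
Zn_p /= np.linalg.norm(Zn_p)     # renormalize after projection

Xn_p = N                         # assumed: Xn' taken along the weld line
Yn_p = np.cross(Zn_p, Xn_p)      # Yn' = Zn' x Xn'

# Columns Xn', Yn', Zn' form the rotation part of the plane-S1 frame Cn'.
Cn_rot = np.column_stack([Xn_p, Yn_p, Zn_p])
```

The test below checks the property the construction is meant to guarantee: the three axes form a proper right-handed orthonormal frame.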
T6 = A1·A2·A3·A4·A5·A6 …… (1)

Δr = atan2(Ny, Nx) and Δr = Δr + 180°

Δp = atan2(−Nz, cosΔr·Nx − sinΔr·Ny)

Δy = atan2(sinΔr·Ax − cosΔr·Ay, −sinΔr·Ox + cosΔr·Oy)
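The atan2-based recovery of the angles (Δr, Δp, Δy) from the columns N, O, A of a rotation matrix can be checked numerically. The sketch below uses the standard roll-pitch-yaw (Rz·Ry·Rx) convention, which fixes the signs inside atan2; the second solution Δr + 180° corresponds to the alternate branch and is omitted here. The angle values are arbitrary test inputs:

```python
import numpy as np

def rot_rpy(r, p, y):
    """Rotation Rz(r) @ Ry(p) @ Rx(y), the standard RPY composition."""
    cr, sr = np.cos(r), np.sin(r)
    cp, sp = np.cos(p), np.sin(p)
    cy, sy = np.cos(y), np.sin(y)
    Rz = np.array([[cr, -sr, 0], [sr, cr, 0], [0, 0, 1]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rx = np.array([[1, 0, 0], [0, cy, -sy], [0, sy, cy]])
    return Rz @ Ry @ Rx

def rpy_from_matrix(R):
    """Recover (dr, dp, dy) from the columns N, O, A of the rotation matrix."""
    N, O, A = R[:, 0], R[:, 1], R[:, 2]
    dr = np.arctan2(N[1], N[0])
    dp = np.arctan2(-N[2], np.cos(dr) * N[0] + np.sin(dr) * N[1])
    dy = np.arctan2(np.sin(dr) * A[0] - np.cos(dr) * A[1],
                    -np.sin(dr) * O[0] + np.cos(dr) * O[1])
    return dr, dp, dy

angles = (0.4, -0.2, 1.1)                      # arbitrary test angles
recovered = rpy_from_matrix(rot_rpy(*angles))  # round-trips to the same angles
```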
Cn = A1·A2·A3·A4·A5·A6 …… (2)

However, the content of the matrices Ai is not necessarily the same in equations (1) and (2) (the arm state of the welding robot 2 differs).
Tn′ = Cn′⁻¹·Cn·Tn …… (3)

Here, Cn′⁻¹ denotes the inverse of the matrix Cn′.

Vb = Cn′⁻¹·Cn·Va

Cn″ = A1·A2·A3·A4·A5·A6 …… (4)

However, the content of the matrices Ai is not necessarily the same in equations (1), (2), and (4) (the arm state of the welding robot 2 differs).

Here, Cn″⁻¹ denotes the inverse of the matrix Cn″.
Next, the second embodiment will be described with reference to FIG. 12. Components identical to those of the first embodiment are given the same reference numerals, and duplicate description is omitted.

The third embodiment will be described with reference to FIG. 14. Steps identical to those of the first and second embodiments are given the same reference numerals, and duplicate description is omitted.
Claims (8)
- 1. A welding apparatus comprising: a welding torch and a shape sensor attached to a welding robot; a shape data extraction unit that extracts, from measurement data of the shape sensor, shape data representing a contour of a welding object; a conversion data calculation unit that calculates, based on a position and posture of the shape sensor, coordinate conversion data for correcting the shape data; a shape data correction unit that corrects the shape data based on the coordinate conversion data; an angle calculation unit that calculates an inclination angle of a groove of the welding object based on the corrected shape data; and a welding position/posture determination unit that determines a position and posture of the welding torch based on the inclination angle of the groove.
- 2. The welding apparatus according to claim 1, wherein the angle calculation unit includes: a change point extraction unit that extracts change points of the corrected shape data; a groove surface extraction unit that extracts a groove surface based on the extracted change points; and an angle calculation unit that calculates an inclination angle of the groove surface.
- 3. The welding apparatus according to claim 1, further comprising a position/posture data generation unit that generates, based on three-dimensional shape data of the welding object and the shape sensor, position/posture data representing a position and posture of the shape sensor at which the welding object and the shape sensor do not interfere with each other.
- 4. The welding apparatus according to claim 1, wherein the plurality of axes include an axis in a first linear direction, an axis in a second direction different from the first linear direction, and a rotation axis.
- 5. The welding apparatus according to claim 1, wherein the shape sensor has an illumination device and an imaging device.
- 6. The welding apparatus according to claim 1, further comprising: a slider device having a plurality of axes; and a control device that controls the slider device based on the determined welding position and posture, wherein the welding robot is installed on one of the plurality of axes.
- 7. A welding method comprising: controlling a position and posture of a shape sensor with respect to a welding object based on position/posture data; extracting, from measurement data of the shape sensor whose position and posture have been controlled based on the position/posture data, shape data representing a contour of the welding object; calculating, based on the position/posture data, coordinate conversion data for correcting the shape data; correcting the shape data using the coordinate conversion data; extracting a plurality of shape change points from the corrected shape data; extracting, from the corrected shape data, a plurality of shape change points corresponding to end portions of a bead; calculating a width of the bead and an inclination angle of a groove surface based on the corrected shape data and the plurality of shape change points; determining welding conditions and a position and posture of the welding torch based on the width of the bead and the inclination angle of the groove surface; and performing welding based on the welding conditions and the position and posture of the welding torch.
- 8. The welding method according to claim 7, further comprising: determining, using three-dimensional shape data of the welding object, third position/posture data representing positions and postures of a shape sensor and a welding torch of a welding apparatus that lie on a vertical plane of a weld line and pass through the center of the angle between a pair of groove surfaces; confirming the presence or absence of interference between the welding apparatus and the welding object when the shape sensor and the welding torch are arranged according to the third position/posture data; and determining, when the interference is confirmed, position/posture data representing positions and postures of the shape sensor and the welding torch at which the welding apparatus and the welding object do not interfere with each other.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
BR112012020766A BR112012020766A2 (pt) | 2010-02-18 | 2011-02-18 | aparelho de soldagem e método de soldagem. |
JP2012500519A JP5847697B2 (ja) | 2010-02-18 | 2011-02-18 | 溶接装置および溶接方法 |
CN201180010075.2A CN102762331A (zh) | 2010-02-18 | 2011-02-18 | 焊接装置以及焊接方法 |
US13/564,867 US20130026148A1 (en) | 2010-02-18 | 2012-08-02 | Welding apparatus and welding method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2010-033769 | 2010-02-18 | ||
JP2010033769 | 2010-02-18 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/564,867 Continuation US20130026148A1 (en) | 2010-02-18 | 2012-08-02 | Welding apparatus and welding method |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2011102142A1 true WO2011102142A1 (ja) | 2011-08-25 |
Family
ID=44482747
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2011/000922 WO2011102142A1 (ja) | 2010-02-18 | 2011-02-18 | 溶接装置および溶接方法 |
Country Status (5)
Country | Link |
---|---|
US (1) | US20130026148A1 (ja) |
JP (1) | JP5847697B2 (ja) |
CN (1) | CN102762331A (ja) |
BR (1) | BR112012020766A2 (ja) |
WO (1) | WO2011102142A1 (ja) |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2013089101A1 (ja) * | 2011-12-13 | 2013-06-20 | 株式会社東芝 | 溶接ビード整形装置およびその整形方法 |
JP2015176510A (ja) * | 2014-03-18 | 2015-10-05 | スターテクノ株式会社 | ワーク加工装置 |
WO2015186795A1 (ja) * | 2014-06-04 | 2015-12-10 | 株式会社神戸製鋼所 | 溶接条件導出装置 |
CN107755937A (zh) * | 2017-08-31 | 2018-03-06 | 中建钢构有限公司 | 变幅摆动焊接方法、装置及焊接机器人 |
JP2022519185A (ja) * | 2019-01-23 | 2022-03-22 | ヌオーヴォ・ピニォーネ・テクノロジー・ソチエタ・レスポンサビリタ・リミタータ | 改善されたツーリング経路生成を伴う産業ロボット装置、及び改善されたツーリング経路に従って産業ロボット装置を動作させる方法 |
WO2024075849A1 (ja) * | 2022-10-07 | 2024-04-11 | パナソニックIpマネジメント株式会社 | 溶接条件管理方法、溶接条件管理プログラムおよび溶接条件管理システム |
WO2024075850A1 (ja) * | 2022-10-07 | 2024-04-11 | パナソニックIpマネジメント株式会社 | 溶接条件管理方法、溶接条件管理プログラムおよび溶接条件管理システム |
Families Citing this family (33)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5637753B2 (ja) * | 2010-07-02 | 2014-12-10 | 株式会社東芝 | 溶接狙い位置計測装置 |
JP2013022705A (ja) * | 2011-07-25 | 2013-02-04 | Sony Corp | ロボット装置及びロボット装置の制御方法、コンピューター・プログラム、並びにロボット・システム |
KR101330049B1 (ko) * | 2012-03-06 | 2013-11-18 | 한국기계연구원 | 3차원 특징점 기반 직접 교시 궤적 후보정 방법 |
JP6033047B2 (ja) * | 2012-11-14 | 2016-11-30 | 株式会社ダイヘン | 多層盛溶接装置 |
US20140327746A1 (en) * | 2013-05-06 | 2014-11-06 | Iphoton Solutions, Llc | Volume reconstruction of an object using a 3d sensor and robotic coordinates |
US10480862B2 (en) | 2013-05-23 | 2019-11-19 | Crc-Evans Pipeline International, Inc. | Systems and methods for use in welding pipe segments of a pipeline |
US10040141B2 (en) | 2013-05-23 | 2018-08-07 | Crc-Evans Pipeline International, Inc. | Laser controlled internal welding machine for a pipeline |
WO2016033568A1 (en) | 2014-08-29 | 2016-03-03 | Crc-Evans Pipeline International Inc. | Method and system for welding |
US10695876B2 (en) | 2013-05-23 | 2020-06-30 | Crc-Evans Pipeline International, Inc. | Self-powered welding systems and methods |
US9821415B2 (en) | 2014-03-28 | 2017-11-21 | Crc-Evans Pipeline International, Inc. | Internal pipeline cooler |
US11767934B2 (en) | 2013-05-23 | 2023-09-26 | Crc-Evans Pipeline International, Inc. | Internally welded pipes |
US10589371B2 (en) | 2013-05-23 | 2020-03-17 | Crc-Evans Pipeline International, Inc. | Rotating welding system and methods |
CN103273490B (zh) * | 2013-05-30 | 2015-09-02 | 青岛博智达自动化技术有限公司 | 一种用于焊接的工业机器人 |
US9314878B2 (en) * | 2013-09-12 | 2016-04-19 | Ford Global Technologies, Llc | Non-destructive aluminum weld quality estimator |
CN104043891B (zh) * | 2014-06-23 | 2016-06-29 | 吉林市金易科焊接技术有限公司 | 具有遥控功能的焊炬上下调节装置 |
CN104191068B (zh) * | 2014-08-26 | 2016-04-13 | 福建省天大精诺信息有限公司 | 一种焊接路径控制方法、装置及系统 |
JP5980867B2 (ja) * | 2014-10-07 | 2016-08-31 | ファナック株式会社 | ロボットをオフラインで教示するロボット教示装置 |
CN104999202B (zh) * | 2015-08-06 | 2016-09-07 | 苏州五圣通机器人自动化有限公司 | 一种高精度机器人自动焊接装置及其工作方法 |
CN106466907A (zh) * | 2015-08-21 | 2017-03-01 | 宁波弘讯科技股份有限公司 | 移取装置及移取方法 |
JP6640553B2 (ja) * | 2015-12-22 | 2020-02-05 | 株式会社東芝 | 溶接方法 |
US11458571B2 (en) | 2016-07-01 | 2022-10-04 | Crc-Evans Pipeline International, Inc. | Systems and methods for use in welding pipe segments of a pipeline |
US10668577B2 (en) | 2016-09-01 | 2020-06-02 | Crc-Evans Pipeline International Inc. | Cooling ring |
JP6705847B2 (ja) * | 2018-02-14 | 2020-06-03 | ファナック株式会社 | 加工結果に基づいた学習制御を行うロボットシステム及びその制御方法 |
CN110456729B (zh) * | 2018-05-07 | 2021-09-28 | 苏州睿牛机器人技术有限公司 | 一种轨迹跟踪控制方法及轨迹跟踪系统 |
KR102083555B1 (ko) * | 2018-06-07 | 2020-03-02 | 삼성중공업 주식회사 | 용접 로봇 및 이를 이용한 용접 방법 |
CN109822194A (zh) * | 2019-01-24 | 2019-05-31 | 江苏理工学院 | 一种焊缝跟踪装置及焊接方法 |
JP7359657B2 (ja) * | 2019-11-07 | 2023-10-11 | ファナック株式会社 | ロボットの位置または姿勢の修正方法を判定する制御装置 |
CN111230364B (zh) * | 2020-02-20 | 2021-09-21 | 北京博清科技有限公司 | 焊枪角度指导系统和焊枪角度指导方法 |
JP6768985B1 (ja) * | 2020-07-15 | 2020-10-14 | 日鉄エンジニアリング株式会社 | 開先形状測定方法、自動溶接方法、および自動溶接装置 |
JP7469264B2 (ja) | 2021-07-28 | 2024-04-16 | 株式会社神戸製鋼所 | 造形装置の制御方法、造形装置及びプログラム |
KR102615646B1 (ko) * | 2022-01-28 | 2023-12-19 | 삼성엔지니어링 주식회사 | 용접 그루브 형성 방법 및 중공형 물품 |
CN114714355B (zh) * | 2022-04-14 | 2023-08-08 | 广州东焊智能装备有限公司 | 自主移动焊接机器人嵌入式视觉跟踪控制系统 |
CN116571852B (zh) * | 2023-07-11 | 2023-09-26 | 四川吉埃智能科技有限公司 | 一种机器人螺柱自动焊接方法和系统 |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2000094131A (ja) * | 1998-09-25 | 2000-04-04 | Kobe Steel Ltd | 溶接姿勢教示方法及びその装置 |
JP2000167666A (ja) * | 1998-12-04 | 2000-06-20 | Hitachi Ltd | 自動溶接及び欠陥補修方法並びに自動溶接装置 |
JP2001328092A (ja) * | 2000-05-22 | 2001-11-27 | Mitsubishi Heavy Ind Ltd | ロボットの干渉回避装置及び方法 |
JP2002120066A (ja) * | 2000-10-12 | 2002-04-23 | Kawasaki Heavy Ind Ltd | 溶接用センサ |
JP2002273675A (ja) * | 2001-03-16 | 2002-09-25 | Kawasaki Heavy Ind Ltd | ロボット制御方法およびロボット制御システム |
JP2009039724A (ja) * | 2007-08-07 | 2009-02-26 | Jfe Engineering Kk | ガスシールドアーク溶接方法 |
Family Cites Families (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US3004166A (en) * | 1958-09-16 | 1961-10-10 | Air Reduction | Line tracer apparatus and method |
JPH06324733A (ja) * | 1993-05-12 | 1994-11-25 | Fanuc Ltd | センサ付きロボットの制御方法及び装置 |
JPH07129217A (ja) * | 1993-10-29 | 1995-05-19 | Fanuc Ltd | レーザセンサを用いたロボット制御方法 |
JP3384335B2 (ja) * | 1998-09-02 | 2003-03-10 | 松下電器産業株式会社 | 自動組立装置および自動組立方法 |
US6205364B1 (en) * | 1999-02-02 | 2001-03-20 | Creo Ltd. | Method and apparatus for registration control during processing of a workpiece particularly during producing images on substrates in preparing printed circuit boards |
US6392192B1 (en) * | 1999-09-15 | 2002-05-21 | W. A. Whitney Co. | Real time control of laser beam characteristics in a laser-equipped machine tool |
JP3806342B2 (ja) * | 2001-11-26 | 2006-08-09 | 三菱重工業株式会社 | 3次元形状物溶接方法及びその装置 |
JP4578056B2 (ja) * | 2003-02-06 | 2010-11-10 | 株式会社ダイヘン | 作業ロボットを用いた制御システムによるワーク加工方法 |
CN101559512B (zh) * | 2009-05-21 | 2011-05-04 | 山东大学 | 基于激光测距的平板对接焊缝焊接轨迹检测与控制方法 |
-
2011
- 2011-02-18 WO PCT/JP2011/000922 patent/WO2011102142A1/ja active Application Filing
- 2011-02-18 JP JP2012500519A patent/JP5847697B2/ja active Active
- 2011-02-18 CN CN201180010075.2A patent/CN102762331A/zh active Pending
- 2011-02-18 BR BR112012020766A patent/BR112012020766A2/pt not_active IP Right Cessation
-
2012
- 2012-08-02 US US13/564,867 patent/US20130026148A1/en not_active Abandoned
Cited By (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9533377B2 (en) | 2011-12-13 | 2017-01-03 | Kabushiki Kaisha Toshiba | Weld bead shaping apparatus and weld bead shaping method |
JP2013123716A (ja) * | 2011-12-13 | 2013-06-24 | Toshiba Corp | 溶接ビード整形装置およびその整形方法 |
CN103987485A (zh) * | 2011-12-13 | 2014-08-13 | 株式会社东芝 | 焊珠成形设备和焊珠成形方法 |
WO2013089101A1 (ja) * | 2011-12-13 | 2013-06-20 | 株式会社東芝 | 溶接ビード整形装置およびその整形方法 |
CN103987485B (zh) * | 2011-12-13 | 2017-03-08 | 株式会社东芝 | 焊珠成形设备和焊珠成形方法 |
JP2015176510A (ja) * | 2014-03-18 | 2015-10-05 | スターテクノ株式会社 | ワーク加工装置 |
JP2015229169A (ja) * | 2014-06-04 | 2015-12-21 | 株式会社神戸製鋼所 | 溶接条件導出装置 |
WO2015186795A1 (ja) * | 2014-06-04 | 2015-12-10 | 株式会社神戸製鋼所 | 溶接条件導出装置 |
US10065259B2 (en) | 2014-06-04 | 2018-09-04 | Kobe Steel, Ltd. | Welding condition derivation device |
CN107755937A (zh) * | 2017-08-31 | 2018-03-06 | 中建钢构有限公司 | 变幅摆动焊接方法、装置及焊接机器人 |
JP2022519185A (ja) * | 2019-01-23 | 2022-03-22 | ヌオーヴォ・ピニォーネ・テクノロジー・ソチエタ・レスポンサビリタ・リミタータ | 改善されたツーリング経路生成を伴う産業ロボット装置、及び改善されたツーリング経路に従って産業ロボット装置を動作させる方法 |
JP7333821B2 (ja) | 2019-01-23 | 2023-08-25 | ヌオーヴォ・ピニォーネ・テクノロジー・ソチエタ・レスポンサビリタ・リミタータ | 改善されたツーリング経路生成を伴う産業ロボット装置、及び改善されたツーリング経路に従って産業ロボット装置を動作させる方法 |
WO2024075849A1 (ja) * | 2022-10-07 | 2024-04-11 | パナソニックIpマネジメント株式会社 | 溶接条件管理方法、溶接条件管理プログラムおよび溶接条件管理システム |
WO2024075850A1 (ja) * | 2022-10-07 | 2024-04-11 | パナソニックIpマネジメント株式会社 | 溶接条件管理方法、溶接条件管理プログラムおよび溶接条件管理システム |
Also Published As
Publication number | Publication date |
---|---|
BR112012020766A2 (pt) | 2016-05-03 |
CN102762331A (zh) | 2012-10-31 |
JP5847697B2 (ja) | 2016-01-27 |
JPWO2011102142A1 (ja) | 2013-06-17 |
US20130026148A1 (en) | 2013-01-31 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2011102142A1 (ja) | 溶接装置および溶接方法 | |
US7034249B2 (en) | Method of controlling the welding of a three-dimensional structure | |
JP5715809B2 (ja) | ロボットの作業プログラム作成方法、ロボットの作業プログラム作成装置、及びロボット制御システム | |
EP3863791B1 (en) | System and method for weld path generation | |
US20090179021A1 (en) | Welding robot | |
US20130133168A1 (en) | Profile measuring apparatus, structure manufacturing system, method for measuring profile, method for manufacturing structure, and non-transitory computer readable medium | |
EP3539710B1 (en) | Applying a cladding layer to a component | |
JP5458769B2 (ja) | ロボットの制御装置 | |
JPWO2020121396A1 (ja) | ロボットキャリブレーションシステム及びロボットキャリブレーション方法 | |
WO2013089101A1 (ja) | 溶接ビード整形装置およびその整形方法 | |
NO309367B1 (no) | Automatisk sporfölgingssystem for sveising av rörledninger | |
JP6771288B2 (ja) | 溶接装置及び溶接装置の制御方法 | |
JP2012135781A (ja) | レーザ加工ロボットの教示方法及び教示装置 | |
JP2011138275A (ja) | アーク溶接ロボットの制御装置及びプログラム | |
WO2018143056A1 (ja) | アーク点調整棒取付構造、及び多関節溶接ロボット、並びに溶接装置 | |
JP7040932B2 (ja) | 溶接位置検出装置、溶接位置検出方法及び溶接ロボットシステム | |
US11203117B2 (en) | Teaching data generation system for vertical multi-joint robot | |
JP2007144538A (ja) | ロボットのティーチングデータ作成方法 | |
WO2021111759A1 (ja) | リペア溶接装置およびリペア溶接方法 | |
JP3937814B2 (ja) | 自動溶接装置 | |
CN207223255U (zh) | 船用大中型弯管环焊缝的自动化焊接装置 | |
Jones et al. | Development of a collaborative robot (COBOT) for increased welding productivity and quality in the shipyard | |
WO2020251036A1 (ja) | リペア溶接システム | |
JP2000117466A (ja) | Yagレーザ加工機のティーチング方法およびその装置 | |
JP4109394B2 (ja) | 作業ロボットの設置状態検出方法 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 201180010075.2 Country of ref document: CN |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 11744433 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2012500519 Country of ref document: JP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2098/KOLNP/2012 Country of ref document: IN |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 11744433 Country of ref document: EP Kind code of ref document: A1 |
|
REG | Reference to national code |
Ref country code: BR Ref legal event code: B01A Ref document number: 112012020766 Country of ref document: BR |
|
ENP | Entry into the national phase |
Ref document number: 112012020766 Country of ref document: BR Kind code of ref document: A2 Effective date: 20120817 |