CN114888501A - Teaching-free programming building component welding device and method based on three-dimensional reconstruction - Google Patents

Teaching-free programming building component welding device and method based on three-dimensional reconstruction

Info

Publication number
CN114888501A
Authority
CN
China
Prior art keywords
welding
component
mechanical arm
point cloud
teaching
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210551258.5A
Other languages
Chinese (zh)
Inventor
姜凯
刘界鹏
李帅
梁全雷
郑星
张龙
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China Construction Fourth Engineering Division Corp Ltd
Original Assignee
China Construction Fourth Engineering Division Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China Construction Fourth Engineering Division Corp Ltd filed Critical China Construction Fourth Engineering Division Corp Ltd
Priority to CN202210551258.5A priority Critical patent/CN114888501A/en
Publication of CN114888501A publication Critical patent/CN114888501A/en
Pending legal-status Critical Current

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B23 MACHINE TOOLS; METAL-WORKING NOT OTHERWISE PROVIDED FOR
    • B23K SOLDERING OR UNSOLDERING; WELDING; CLADDING OR PLATING BY SOLDERING OR WELDING; CUTTING BY APPLYING HEAT LOCALLY, e.g. FLAME CUTTING; WORKING BY LASER BEAM
    • B23K 37/00 Auxiliary devices or processes, not specially adapted to a procedure covered by only one of the preceding main groups
    • B23K 37/02 Carriages for supporting the welding or cutting element
    • B23K 37/0252 Steering means
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 9/00 Programme-controlled manipulators
    • B25J 9/16 Programme controls
    • B25J 9/1602 Programme controls characterised by the control system, structure, architecture
    • B25J 9/1605 Simulation of manipulator lay-out, design, modelling of manipulator
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 9/00 Programme-controlled manipulators
    • B25J 9/16 Programme controls
    • B25J 9/1656 Programme controls characterised by programming, planning systems for manipulators
    • B25J 9/1664 Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 9/00 Programme-controlled manipulators
    • B25J 9/16 Programme controls
    • B25J 9/1679 Programme controls characterised by the tasks executed
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 9/00 Programme-controlled manipulators
    • B25J 9/16 Programme controls
    • B25J 9/1679 Programme controls characterised by the tasks executed
    • B25J 9/1692 Calibration of manipulator
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 9/00 Programme-controlled manipulators
    • B25J 9/16 Programme controls
    • B25J 9/1694 Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J 9/1697 Vision controlled systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Robotics (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Numerical Control (AREA)

Abstract

The invention relates to the technical field of building construction, and in particular to a teaching-free programming welding device and method for building components based on three-dimensional reconstruction. The device comprises a welding device, which supplies welding material and energy for the welding task; a mechanical arm, which controls the pose and travel speed of the welding gun at its end; a visual reconstruction module, which generates three-dimensional point cloud data; and a computer workstation, which receives and processes the point cloud data generated by the visual reconstruction module, generates a welding program, and controls and guides the mechanical arm to weld the component. The device is well suited to producing small-batch, non-standard welded steel members for buildings, improves working efficiency, and has broad application prospects.

Description

Teaching-free programming building component welding device and method based on three-dimensional reconstruction
Technical Field
The invention relates to the technical field of building construction, in particular to a teaching-free programming building component welding device and method based on three-dimensional reconstruction.
Background
Steel structural members are the main members of high-rise buildings, bridges and similar structures. They are mostly H-section steel beams or box beams; because such members are large and carry long, straight weld seams, they can currently be welded with automatic submerged-arc welding machines, with stable quality and high efficiency. However, structural parts that are too small for the submerged-arc welding machine cannot be welded automatically; at present they are welded mainly by hand, and the welding quality cannot be guaranteed.
With the development of automation technology, welding robots based on mechanical arms are widely used in standardized production-line industries such as automobile manufacturing and mould production. According to the mode of operation, robot-arm welding machines are mainly divided into a teaching mode and an off-line programming mode. The teaching mode requires the component to be placed in a fixed position and demands a skilled operator, who programs the robot with its teach pendant; the off-line programming mode generates the welding trajectory from the three-dimensional design model of the component and therefore requires the deviation between design and fabrication to be small. Both modes are consequently suited only to the production of standard components, or of components held in dedicated fixtures. Because building components are machined and assembled with relatively low precision and are not placed in fixed positions during production, such non-standard components are not suited to these production-line scenarios.
Disclosure of Invention
Accordingly, an object of the present invention is to provide a teaching-free programming building component welding device and method based on three-dimensional reconstruction, which is suitable for producing small-batch, non-standard welded steel components for construction and can improve working efficiency.
The technical scheme of the invention is as follows: a teaching-free programming building component welding device based on three-dimensional reconstruction comprises
a welding device, which supplies welding material and energy for the welding task;
a mechanical arm, which controls the pose and travel speed of the welding gun at its end;
a visual reconstruction module, which generates three-dimensional point cloud data; and
a computer workstation, which receives and processes the three-dimensional point cloud data generated by the visual reconstruction module, generates a welding program, and controls and guides the mechanical arm to complete welding of the welding component.
With the invention, teaching programming for different welding components can be completed quickly without a dedicated robot operator: weld seam extraction and path planning are carried out from the three-dimensional reconstruction of the component. The device is well suited to producing small-batch, non-standard welded steel members for buildings and improves working efficiency.
Further preferably, the visual reconstruction module comprises three supports, each fitted with a depth camera; the three supports are arranged in a triangle, and the shooting angles of the three depth cameras all face the position where the welding component is placed. After the poses of the three depth cameras are adjusted, camera calibration of the three depth cameras and coordinate-system calibration of the visual reconstruction module are completed on the computer workstation; the calibrated vision system then collects the three-dimensional information of the component on the worktable, represented as a point cloud.
Further preferably, the support is a cylindrical support.
Further preferably, the visual reconstruction module further comprises a connecting rod and an angle adjusting knob, one end of the connecting rod is connected with the support, the other end of the connecting rod is connected with the depth camera, and the angle adjusting knob is arranged on one side of the connecting rod. The pose angles of the three depth cameras can be adjusted through the angle adjusting knobs.
Further preferably, the connecting rod is a spherical connecting rod. The spherical connecting rod provides a simple connection and makes the angle of the depth camera easy to adjust.
Further preferably, the support comprises a base and a cylindrical supporting rod, the cylindrical supporting rod is fixed on the base, and the connecting rod is arranged at the upper end of the cylindrical supporting rod.
Further preferably, the three depth cameras are mounted 1 m to 2 m above the ground, and most preferably 1.5 m above the ground. At this height the welding component can be imaged clearly.
Specifically, a method using the teaching-free programming building component welding device based on three-dimensional reconstruction comprises the following steps:
environment construction: install the visual reconstruction module with the component to be welded as the central object, and mount the three infrared depth cameras in a triangular arrangement around it;
depth camera calibration: calibrate each depth camera, modelled as a pinhole camera, with Zhang Zhengyou's calibration method; generate a black-and-white checkerboard with OpenCV, print it to make a calibration board, place the board at different positions and angles, and photograph it in turn with the depth camera; detect the corners in the captured pictures, extract sub-pixel corner coordinates and compute the three-dimensional coordinates of the extracted corners, thereby obtaining the camera's intrinsic matrix and distortion parameters and completing the camera calibration;
hand-eye calibration: the depth camera photographs the component to be welded to obtain its position in the camera coordinate system; with the relative position of the mechanical arm base and the camera kept fixed, the transformation matrix between the mechanical arm base coordinate system and the camera coordinate system is solved, i.e. hand-eye calibration in the eye-to-hand (eye outside the hand) configuration;
mechanical arm kinematics analysis: describe the motion of the mechanism with link parameters, namely link length, link twist, link offset and joint angle, i.e. model the mechanical arm with DH parameters; solve the forward and inverse kinematics of the arm from the DH parameters and establish the transformation matrix of the end-effector coordinate system relative to the base coordinate system, forming the forward and inverse kinematics equations of the mechanical arm (an illustrative kinematics sketch follows these steps);
multi-angle point cloud stitching: the three cameras acquire point cloud data of the component to be welded from multiple angles, and the clouds are stitched by point cloud registration; the iterative closest point (ICP) algorithm registers and merges the clouds into the point cloud of the complete welding component, completing the three-dimensional reconstruction;
weld extraction and spatial path generation: apply a pass-through filter to the reconstructed point cloud, with the x, y and z limits set according to the filter parameters and the mounting positions of the depth cameras, so that the acquired component point cloud lies within the limits and excess background points are removed; compute the extreme values of the filtered cloud in the x, y and z directions, assign voxel sizes in each direction, and down-sample the resulting voxel cells by averaging the points in each voxel; segment the component from the ground with the random sample consensus (RANSAC) method, and take the points in regions where the normal vectors of the extracted component change as weld seam points; fit the extracted weld seam points with RANSAC to obtain the spatial line of the weld and its start and end points; the points on this line are the positions through which the end of the welding robot moves while welding, and the spatial motion path of the mechanical arm is obtained from the coordinate transformation between the point cloud coordinate system and the robot coordinate system found by hand-eye calibration.
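As an editor's illustration of the kinematics analysis step above (not part of the original disclosure), the following sketch builds the standard DH transformation matrix for one link and chains it over all joints to obtain the end-effector pose relative to the base. The DH table values and joint angles are placeholders, since the patent does not disclose a specific six-axis arm.

    # Illustrative sketch only: forward kinematics of a six-axis arm from standard DH
    # parameters (a = link length, alpha = link twist, d = link offset, theta = joint angle).
    import numpy as np

    def dh_transform(a, alpha, d, theta):
        """Homogeneous transform of one link under the standard DH convention."""
        ct, st = np.cos(theta), np.sin(theta)
        ca, sa = np.cos(alpha), np.sin(alpha)
        return np.array([
            [ct, -st * ca,  st * sa, a * ct],
            [st,  ct * ca, -ct * sa, a * st],
            [0.0,      sa,       ca,      d],
            [0.0,     0.0,      0.0,    1.0],
        ])

    def forward_kinematics(dh_table, joint_angles):
        """Chain the link transforms: pose of the end effector in the base frame."""
        T = np.eye(4)
        for (a, alpha, d), theta in zip(dh_table, joint_angles):
            T = T @ dh_transform(a, alpha, d, theta)
        return T

    # Placeholder DH table (a, alpha, d) for six links and an arbitrary joint configuration.
    dh_table = [(0.0, np.pi / 2, 0.4), (0.6, 0.0, 0.0), (0.1, np.pi / 2, 0.0),
                (0.0, -np.pi / 2, 0.6), (0.0, np.pi / 2, 0.0), (0.0, 0.0, 0.1)]
    T_base_to_tool = forward_kinematics(dh_table, np.deg2rad([10, -30, 45, 0, 60, 0]))
    print(T_base_to_tool)  # 4x4 homogeneous pose of the welding-gun flange

Inverse kinematics then solves this transformation for the joint angles, either numerically or with the closed-form solution of the particular six-axis arm.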
Specifically, in the environment construction step, each depth camera is aimed at the component to be welded so that the object is fully in view; the initial position of the mechanical arm, i.e. the position it holds while the depth cameras shoot for reconstruction, is chosen so that the arm does not shield the component during shooting, ensuring the component's visibility.
Specifically, in the depth camera calibration step, the depth cameras shoot the calibration board in turn, and 10 pictures are taken in total.
Compared with the prior art, the beneficial effects are as follows: with the invention, teaching programming for different welding components can be completed quickly without a dedicated robot operator, the welding program being generated by weld seam extraction and path planning from the three-dimensional reconstruction of the component. The device is well suited to producing small-batch, non-standard welded steel members for buildings, improves working efficiency and has broad application prospects.
For a better understanding and practice, the invention is described in detail below with reference to the accompanying drawings.
Drawings
Fig. 1 is a schematic view of the overall structure of the present invention.
Fig. 2 is a schematic diagram of a visual reconstruction module of the present invention.
Fig. 3 is an enlarged schematic view of portion A of Fig. 2.
FIG. 4 is a schematic flow chart of the method of the present invention.
Fig. 5 is a schematic view of the overall module of the present invention.
Detailed Description
The orientation terms up, down, left, right, front, back, top, bottom and the like referred to, or that may be referred to, in this specification are defined relative to the configuration shown and are relative concepts; they may therefore change with different positions and different states of use and should not be construed as limiting terms.
The implementations described in the exemplary embodiments below are not intended to represent all implementations consistent with the present disclosure. Rather, they are merely examples of implementations consistent with certain aspects of the present disclosure.
The terminology used in the present disclosure is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. As used in this disclosure, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items.
As shown in Figs. 1-3, a teaching-free programming building component welding device based on three-dimensional reconstruction comprises
a welding device 1, which supplies welding material and energy for the welding task;
a mechanical arm 2, which controls the pose and travel speed of the welding gun at its end; the mechanical arm receives the welding path from the computer workstation and accordingly completes the corresponding welding task on the welding component;
a visual reconstruction module 4, which generates three-dimensional point cloud data; its depth cameras collect depth maps and transmit them to the computer workstation;
a computer workstation 3, which receives and processes the three-dimensional point cloud data generated by the visual reconstruction module 4, generates a welding program, and controls and guides the mechanical arm 2 to complete welding of the welding component 5.
In the invention, the computer workstation 3 (a computer server) consists of a host computer, peripherals and a display; it runs the software that communicates with the mechanical arm controller, processes and reconstructs the acquired information, and extracts the weld seams.
The mechanical arm 2, connected to the computer workstation 3, is a six-axis industrial arm with a welding gun fixed at its end; it receives the end-effector spatial path computed by the computer workstation 3, which completes the automatic programming of the arm so that it moves and changes pose automatically along the welding path of the member to be welded; the welding device and the communication module control the welding process and transmit welding process information.
In this embodiment, teaching programming for different welding components 5 is completed quickly without a dedicated robot operator: weld seam extraction and path planning are carried out from the three-dimensional reconstruction of the component. The device is well suited to producing small-batch, non-standard welded steel members for buildings and improves working efficiency.
Specifically, the visual reconstruction module 4 includes three brackets 41, each fitted with a depth camera 44; the three brackets 41 are arranged in a triangle, and the shooting angles of the three depth cameras 44 all face the position where the welding member 5 is placed. The brackets 41 are cylindrical.
After the poses of the three depth cameras are adjusted, camera calibration of the three depth cameras and coordinate-system calibration of the visual reconstruction module are completed on the computer workstation; the calibrated vision system then collects the three-dimensional information of the component on the worktable, represented as a point cloud.
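A minimal sketch of how the coordinate-system calibration of the three cameras could be performed is given below, assuming OpenCV is used: each camera observes the same calibration board, its pose relative to the board is solved with a perspective-n-point solver, and the relative transform between two cameras follows by composition. The helper names, board layout and intrinsics handling are the editor's assumptions, not details from the patent.

    # Illustrative sketch only: express the three depth cameras in one common frame
    # by having each observe the same checkerboard in a synchronized shot.
    import cv2
    import numpy as np

    def camera_to_board(image_gray, board_size, square_size, K, dist):
        """Pose of the calibration board in this camera's frame (4x4), or None."""
        found, corners = cv2.findChessboardCorners(image_gray, board_size)
        if not found:
            return None
        corners = cv2.cornerSubPix(image_gray, corners, (11, 11), (-1, -1),
                                   (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
        # 3-D board corner coordinates in the board frame (z = 0 plane).
        objp = np.zeros((board_size[0] * board_size[1], 3), np.float32)
        objp[:, :2] = np.mgrid[0:board_size[0], 0:board_size[1]].T.reshape(-1, 2) * square_size
        ok, rvec, tvec = cv2.solvePnP(objp, corners, K, dist)
        if not ok:
            return None
        T = np.eye(4)
        T[:3, :3], _ = cv2.Rodrigues(rvec)
        T[:3, 3] = tvec.ravel()
        return T  # board pose expressed in this camera's frame

    def camera2_to_camera1(T_c1_board, T_c2_board):
        """Relative extrinsics: maps points from camera 2's frame into camera 1's frame."""
        return T_c1_board @ np.linalg.inv(T_c2_board)

With such pairwise extrinsics, the clouds of cameras 2 and 3 can be expressed in camera 1's frame before registration.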
Specifically, the visual reconstruction module 4 further includes a connecting rod 43 and an angle adjusting knob 42, one end of the connecting rod 43 is connected to the bracket 41, the other end of the connecting rod 43 is connected to the depth camera 44, and the angle adjusting knob 42 is disposed on one side of the connecting rod 43. The attitude angles of the three depth cameras 44 can be adjusted by the angle adjustment knob 42.
Specifically, the link 43 is a spherical link, and the connection manner of the spherical link is simple, and the angle of the depth camera is easily adjusted.
Further, the bracket 41 includes a base and a cylindrical support rod fixed on the base, and the connecting rod 43 is arranged at the upper end of the support rod. The three depth cameras 44 are mounted 1 m to 2 m above the ground, specifically 1.5 m; at this height the welding member 5 can be imaged clearly.
In this embodiment, the visual reconstruction module 4 comprises the cylindrical brackets 41 arranged in a triangle, the angle adjusting knobs 42, the spherical connecting rods 43 and the depth cameras 44. The depth cameras 44 are fixed on the spherical connecting rods 43 with their shooting angles facing the position where the welding component 5 is placed, and the pose angles of the three fixed depth cameras 44 can be adjusted with the angle adjusting knobs 42. After the poses of the three depth cameras 44 are adjusted, camera calibration of the three cameras and coordinate-system calibration of the visual reconstruction module are completed on the computer workstation 3; the calibrated vision system then collects the three-dimensional information of the component on the worktable, represented as a point cloud.
The computer workstation receives the point cloud data, obtains the point cloud edge information with an edge detection algorithm, and screens the weld seam edges using their spatial topological relations to extract the weld seam; from the extracted edge information, combined with the coordinate transformation between the mechanical arm and the reconstruction system, the weld seam information is converted into welding path points in the mechanical arm coordinate system, the path of the arm is planned, and the mechanical arm welding program is generated.
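The coordinate transformation between the reconstruction system and the mechanical arm referred to above comes from hand-eye calibration in the eye-to-hand configuration. A minimal sketch using OpenCV's calibrateHandEye is shown below; feeding it base-in-gripper poses (the inverse of the gripper poses read from the controller) is a commonly used way to handle the eye-outside-hand case, and the function and variable names are the editor's assumptions rather than the patent's.

    # Illustrative sketch only: eye-to-hand calibration with OpenCV.
    import cv2
    import numpy as np

    def eye_to_hand_calibration(gripper_in_base_poses, target_in_cam_poses):
        """Return the camera pose expressed in the robot base frame (4x4).

        gripper_in_base_poses: 4x4 poses of the welding-gun flange in the base frame,
                               read from the arm controller at each calibration station.
        target_in_cam_poses:   4x4 poses of the calibration target in the camera frame,
                               e.g. from solvePnP on the checkerboard.
        """
        R_b2g, t_b2g, R_t2c, t_t2c = [], [], [], []
        for T_gb, T_tc in zip(gripper_in_base_poses, target_in_cam_poses):
            T_bg = np.linalg.inv(T_gb)          # base expressed in the gripper frame
            R_b2g.append(T_bg[:3, :3]); t_b2g.append(T_bg[:3, 3])
            R_t2c.append(T_tc[:3, :3]); t_t2c.append(T_tc[:3, 3])
        R, t = cv2.calibrateHandEye(R_b2g, t_b2g, R_t2c, t_t2c,
                                    method=cv2.CALIB_HAND_EYE_TSAI)
        T_cam_in_base = np.eye(4)
        T_cam_in_base[:3, :3], T_cam_in_base[:3, 3] = R, t.ravel()
        return T_cam_in_base

    # A weld point p_cam found in the point-cloud (camera) frame then maps to the base frame as:
    # p_base = (T_cam_in_base @ np.append(p_cam, 1.0))[:3]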
The controller of the mechanical arm receives the welding program, moves the arm to adjust the position and attitude of the welding gun, and controls the welding device to complete the weld at the seam of the component to be welded.
The visual reconstruction module comprises three infrared binocular depth cameras 44, each fixed on a holder 1.5 m above the ground; when the three cameras are installed, their centres are all aimed at the component to be welded and they are arranged on site in a triangle; the three cameras are calibrated with a square calibration board and the camera coordinate systems are established; the visual reconstruction module is connected to the computer workstation by wired USB 3.0, and the pictures and point cloud information collected by the cameras are transmitted synchronously to the workstation at a frame rate of 25 FPS.
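The intrinsic calibration with Zhang Zhengyou's checkerboard method described in this document could be implemented with OpenCV roughly as follows; the board dimensions, square size and file paths are placeholders chosen by the editor, not values disclosed in the patent.

    # Illustrative sketch only: intrinsic calibration of one camera with Zhang's checkerboard method.
    import glob
    import cv2
    import numpy as np

    BOARD = (9, 6)          # inner corners per row and per column (placeholder)
    SQUARE = 0.025          # square edge length in metres (placeholder)

    # 3-D corner coordinates of the board in its own frame (z = 0 plane).
    objp = np.zeros((BOARD[0] * BOARD[1], 3), np.float32)
    objp[:, :2] = np.mgrid[0:BOARD[0], 0:BOARD[1]].T.reshape(-1, 2) * SQUARE

    obj_points, img_points = [], []
    criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3)

    for path in sorted(glob.glob("calib/cam1_*.png")):      # the ~10 calibration shots
        gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
        found, corners = cv2.findChessboardCorners(gray, BOARD)
        if not found:
            continue
        corners = cv2.cornerSubPix(gray, corners, (11, 11), (-1, -1), criteria)
        obj_points.append(objp)
        img_points.append(corners)

    # Intrinsic matrix K and distortion coefficients complete the camera calibration.
    rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(obj_points, img_points,
                                                     gray.shape[::-1], None, None)
    print("reprojection RMS:", rms, "\nK =\n", K, "\ndist =", dist.ravel())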
With the invention, teaching programming for different welding components is completed quickly without a dedicated robot operator, the program being generated by weld seam extraction and path planning from the three-dimensional reconstruction of the component. The device is well suited to producing small-batch, non-standard welded steel members for buildings, improves working efficiency and has broad application prospects.
A method using the teaching-free programming building component welding device based on three-dimensional reconstruction, as shown in Figs. 1-5, comprises the following steps:
environment construction: install the visual reconstruction module 4 with the welding component 5 as the central object, and mount the three infrared depth cameras 44 in a triangular arrangement around it;
depth camera calibration: calibrate each depth camera, modelled as a pinhole camera, with Zhang Zhengyou's calibration method; generate a black-and-white checkerboard with OpenCV, print it to make a calibration board, place the board at different positions and angles, and photograph it in turn with the depth camera; detect the corners in the captured pictures, extract sub-pixel corner coordinates and compute the three-dimensional coordinates of the extracted corners, thereby obtaining the camera's intrinsic matrix and distortion parameters and completing the camera calibration;
hand-eye calibration: the depth camera photographs the component to be welded to obtain its position in the camera coordinate system; with the relative position of the mechanical arm base and the camera kept fixed, the transformation matrix between the mechanical arm base coordinate system and the camera coordinate system is solved, i.e. hand-eye calibration in the eye-to-hand (eye outside the hand) configuration;
mechanical arm kinematics analysis: describe the motion of the mechanism with link parameters, namely link length, link twist, link offset and joint angle, i.e. model the mechanical arm with DH parameters; solve the forward and inverse kinematics of the arm from the DH parameters and establish the transformation matrix of the end-effector coordinate system relative to the base coordinate system, forming the forward and inverse kinematics equations of the mechanical arm;
multi-angle point cloud stitching: the three cameras acquire point cloud data of the component to be welded from multiple angles, and the clouds are stitched by point cloud registration; the iterative closest point (ICP) algorithm registers and merges the clouds into the point cloud of the complete welding component, completing the three-dimensional reconstruction (an illustrative registration sketch is given later in this description);
weld extraction and spatial path generation: apply a pass-through filter to the reconstructed point cloud, with the x, y and z limits set according to the filter parameters and the mounting positions of the depth cameras, so that the acquired component point cloud lies within the limits and excess background points are removed; compute the extreme values of the filtered cloud in the x, y and z directions, assign voxel sizes in each direction, and down-sample the resulting voxel cells by averaging the points in each voxel; segment the component from the ground with the random sample consensus (RANSAC) method, and take the points in regions where the normal vectors of the extracted component change as weld seam points; fit the extracted weld seam points with RANSAC to obtain the spatial line of the weld and its start and end points; the points on this line are the positions through which the end of the welding robot moves while welding, and the spatial motion path of the mechanical arm is obtained from the coordinate transformation between the point cloud coordinate system and the robot coordinate system found by hand-eye calibration.
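A minimal sketch of the weld extraction and path generation step just described is given below, using the Open3D library as one possible implementation. The crop limits, voxel size, thresholds, the simple normal-change heuristic and the least-squares line fit (used here in place of the RANSAC fit named in the step) are the editor's assumptions, not parameters disclosed in the patent.

    # Illustrative sketch only: pass-through crop, voxel down-sampling, RANSAC ground removal,
    # weld-point selection from normal-vector changes, line fit, and transform to the robot frame.
    import numpy as np
    import open3d as o3d

    def extract_weld_path(pcd, T_cam_in_base, crop_min, crop_max, voxel=0.005):
        # Pass-through filtering: keep only points inside the x/y/z limits around the component.
        box = o3d.geometry.AxisAlignedBoundingBox(np.asarray(crop_min), np.asarray(crop_max))
        pcd = pcd.crop(box)
        # Voxel down-sampling (the points inside each voxel are averaged).
        pcd = pcd.voxel_down_sample(voxel_size=voxel)
        # RANSAC plane segmentation removes the ground/worktable plane.
        _, ground_idx = pcd.segment_plane(distance_threshold=0.01, ransac_n=3, num_iterations=1000)
        component = pcd.select_by_index(ground_idx, invert=True)
        # Candidate weld points: regions where the surface normal direction changes sharply.
        component.estimate_normals(o3d.geometry.KDTreeSearchParamHybrid(radius=3 * voxel, max_nn=30))
        n = np.asarray(component.normals)
        mean_n = n.mean(axis=0) / np.linalg.norm(n.mean(axis=0))
        seam_idx = np.where(np.abs(n @ mean_n) < 0.7)[0]          # heuristic threshold
        seam_pts = np.asarray(component.points)[seam_idx]
        # Least-squares line fit through the seam points (direction = principal component).
        centroid = seam_pts.mean(axis=0)
        direction = np.linalg.svd(seam_pts - centroid)[2][0]
        t = (seam_pts - centroid) @ direction
        start_cam, end_cam = centroid + t.min() * direction, centroid + t.max() * direction
        # Map the start/end points from the point-cloud (camera) frame into the robot base frame.
        to_base = lambda p: (T_cam_in_base @ np.append(p, 1.0))[:3]
        return to_base(start_cam), to_base(end_cam)

The returned start and end points, together with intermediate points interpolated along the fitted line, form the spatial welding path sent to the mechanical arm controller.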
In the environment construction step, each depth camera is aimed at the component to be welded so that the object is fully in view; the initial position of the mechanical arm, i.e. the position it holds while the depth cameras shoot for reconstruction, is chosen so that the arm does not shield the component during shooting, ensuring the component's visibility.
In the depth camera calibration step, the depth cameras shoot the calibration board in turn, and 10 pictures are taken in total.
The method achieves rapid three-dimensional acquisition of the component and efficient planning of the mechanical arm's welding path. The welding program is generated by visual three-dimensional modelling, without prior teaching and without relying on three-dimensional design model information, which improves the applicability of the robotic welding system and helps replace the manual welding currently used for small-batch, non-standard welded steel members for buildings.
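As an illustration of the multi-angle point cloud stitching referred to above, the following sketch registers the clouds from cameras 2 and 3 onto camera 1 with point-to-point ICP in Open3D and merges them. The initial transforms are assumed to come from the coordinate-system calibration of the three cameras, and the correspondence distance and voxel size are placeholders chosen by the editor.

    # Illustrative sketch only: merge the three depth cameras' clouds with pairwise ICP.
    import open3d as o3d

    def stitch(cloud1, cloud2, cloud3, init_2_to_1, init_3_to_1, max_dist=0.02):
        est = o3d.pipelines.registration.TransformationEstimationPointToPoint()
        # Refine the rough camera-to-camera transforms with ICP against cloud1.
        reg2 = o3d.pipelines.registration.registration_icp(cloud2, cloud1, max_dist, init_2_to_1, est)
        reg3 = o3d.pipelines.registration.registration_icp(cloud3, cloud1, max_dist, init_3_to_1, est)
        merged = cloud1 + cloud2.transform(reg2.transformation) + cloud3.transform(reg3.transformation)
        # Light down-sampling removes duplicated points in the overlap regions.
        return merged.voxel_down_sample(voxel_size=0.003)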
It should be understood that the above-described embodiments are merely examples given to illustrate the invention clearly and are not intended to limit it. Other variations and modifications will be apparent to persons skilled in the art in light of the above description; it is neither necessary nor possible to enumerate all embodiments here. Any modification, equivalent replacement or improvement made within the spirit and principle of the invention shall fall within the protection scope of the claims.

Claims (9)

1. A teaching-free programming building component welding device based on three-dimensional reconstruction, characterized by comprising
a welding device (1), which supplies welding material and energy for the welding task;
a mechanical arm (2), which controls the pose and travel speed of the welding gun at its end;
a visual reconstruction module (4), which generates three-dimensional point cloud data; and
a computer workstation (3), which receives and processes the three-dimensional point cloud data generated by the visual reconstruction module (4), generates a welding program, and controls and guides the mechanical arm (2) to complete welding of the welding component (5).
2. The teaching-free programming building component welding device based on three-dimensional reconstruction as claimed in claim 1, characterized in that: the visual reconstruction module (4) comprises three brackets (41), the three brackets (41) are provided with depth cameras (44), the three brackets (41) are arranged in a triangular shape, and the shooting angles of the three depth cameras (44) face the placing positions of the welding components (5); the bracket (41) is a cylindrical bracket.
3. The teaching-free programming building component welding device based on three-dimensional reconstruction as claimed in claim 2, characterized in that: the visual reconstruction module (4) further comprises a connecting rod (43) and an angle adjusting knob (42), one end of the connecting rod (43) is connected with the support (41), the other end of the connecting rod (43) is connected with the depth camera (44), and the angle adjusting knob (42) is arranged on one side of the connecting rod (43); the connecting rod (43) is a spherical connecting rod.
4. The teaching-free programming building component welding device based on three-dimensional reconstruction as claimed in claim 3, characterized in that: the support (41) comprises a base and a cylindrical supporting rod, the cylindrical supporting rod is fixed on the base, and the connecting rod (43) is arranged at the upper end of the cylindrical supporting rod.
5. The teaching-free programming building component welding device based on three-dimensional reconstruction as recited in any of claims 2 to 4, characterized in that: the distances between the three depth cameras (44) and the ground are 1-2 m.
6. The teaching-free programming building component welding device based on three-dimensional reconstruction as recited in claim 5, wherein: the distances between the three depth cameras (44) and the ground are 1.5 meters.
7. A method using the teaching-free programming building component welding device based on three-dimensional reconstruction according to claim 6, characterized by comprising the following steps:
environment construction: installing the visual reconstruction module (4) with the welding component (5) as the central object, and mounting the three infrared depth cameras (44) in a triangular arrangement around it;
depth camera calibration: calibrating each depth camera, modelled as a pinhole camera, with Zhang Zhengyou's calibration method; generating a black-and-white checkerboard with OpenCV, printing it to make a calibration board, placing the board at different positions and angles, and photographing it in turn with the depth camera; detecting the corners in the captured pictures, extracting sub-pixel corner coordinates and computing the three-dimensional coordinates of the extracted corners, thereby obtaining the camera's intrinsic matrix and distortion parameters and completing the camera calibration;
hand-eye calibration: photographing the component to be welded with the depth camera to obtain its position in the camera coordinate system; keeping the relative position of the mechanical arm base and the camera unchanged, solving the transformation matrix between the mechanical arm base coordinate system and the camera coordinate system, i.e. performing hand-eye calibration in the eye-to-hand (eye outside the hand) configuration;
mechanical arm kinematics analysis: describing the motion of the mechanism with link parameters, namely link length, link twist, link offset and joint angle, i.e. modelling the mechanical arm with DH parameters; solving the forward and inverse kinematics of the arm from the DH parameters and establishing the transformation matrix of the end-effector coordinate system relative to the base coordinate system, forming the forward and inverse kinematics equations of the mechanical arm;
multi-angle point cloud stitching: acquiring point cloud data of the component to be welded from multiple angles with the three cameras and stitching the clouds by point cloud registration; registering and merging the clouds with the iterative closest point (ICP) algorithm to obtain the point cloud of the complete welding component and complete the three-dimensional reconstruction;
weld extraction and spatial path generation: applying a pass-through filter to the reconstructed point cloud, with the x, y and z limits set according to the filter parameters and the mounting positions of the depth cameras, so that the acquired component point cloud lies within the limits and excess background points are removed; computing the extreme values of the filtered cloud in the x, y and z directions, assigning voxel sizes in each direction, and down-sampling the resulting voxel cells by averaging the points in each voxel; segmenting the component from the ground with the random sample consensus (RANSAC) method, and taking the points in regions where the normal vectors of the extracted component change as weld seam points; fitting the extracted weld seam points with RANSAC to obtain the spatial line of the weld and its start and end points, the points on this line being the positions through which the end of the welding robot moves while welding, and obtaining the spatial motion path of the mechanical arm from the coordinate transformation between the point cloud coordinate system and the robot coordinate system found by hand-eye calibration.
8. The method of claim 7, wherein: in the environment construction step, each depth camera is aimed at the component to be welded so that the object is fully in view; the initial position of the mechanical arm, i.e. the position it holds while the depth cameras shoot for reconstruction, is chosen so that the arm does not shield the component during shooting, ensuring the component's visibility.
9. The method of claim 7, wherein: in the depth camera calibration step, the depth cameras shoot the calibration board in turn, and 10 pictures are taken in total.
CN202210551258.5A 2022-05-20 2022-05-20 Teaching-free programming building component welding device and method based on three-dimensional reconstruction Pending CN114888501A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210551258.5A CN114888501A (en) 2022-05-20 2022-05-20 Teaching-free programming building component welding device and method based on three-dimensional reconstruction

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210551258.5A CN114888501A (en) 2022-05-20 2022-05-20 Teaching-free programming building component welding device and method based on three-dimensional reconstruction

Publications (1)

Publication Number Publication Date
CN114888501A true CN114888501A (en) 2022-08-12

Family

ID=82723056

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210551258.5A Pending CN114888501A (en) 2022-05-20 2022-05-20 Teaching-free programming building component welding device and method based on three-dimensional reconstruction

Country Status (1)

Country Link
CN (1) CN114888501A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117047237A (en) * 2023-10-11 2023-11-14 太原科技大学 Intelligent flexible welding system and method for special-shaped parts
CN117047237B (en) * 2023-10-11 2024-01-19 太原科技大学 Intelligent flexible welding system and method for special-shaped parts

Similar Documents

Publication Publication Date Title
CN110524580B (en) Welding robot vision assembly and measuring method thereof
CN107883929B (en) Monocular vision positioning device and method based on multi-joint mechanical arm
US8406923B2 (en) Apparatus for determining pickup pose of robot arm with camera
JP7153085B2 (en) ROBOT CALIBRATION SYSTEM AND ROBOT CALIBRATION METHOD
CN109719438B (en) Automatic tracking method for welding seam of industrial welding robot
CN113276106B (en) Climbing robot space positioning method and space positioning system
CN110450163A (en) The general hand and eye calibrating method based on 3D vision without scaling board
CN114434059B (en) Automatic welding system and method for large structural part with combined robot and three-dimensional vision
CN112958959A (en) Automatic welding and detection method based on three-dimensional vision
CN111801198A (en) Hand-eye calibration method, system and computer storage medium
CN114289934B (en) Automatic welding system and method for large structural part based on three-dimensional vision
CA2799042A1 (en) Method and system for generating instructions for an automated machine
CN114043087B (en) Three-dimensional trajectory laser welding seam tracking attitude planning method
KR102096897B1 (en) The auto teaching system for controlling a robot using a 3D file and teaching method thereof
CN113146620A (en) Binocular vision-based double-arm cooperative robot system and control method
CN112577447B (en) Three-dimensional full-automatic scanning system and method
CN111975200A (en) Intelligent welding method and intelligent welding system based on visual teaching technology
CN110039520B (en) Teaching and processing system based on image contrast
CN106737859A (en) The method for calibrating external parameters of sensor and robot based on invariable plane
CN114888501A (en) Teaching-free programming building component welding device and method based on three-dimensional reconstruction
CN112958974A (en) Interactive automatic welding system based on three-dimensional vision
CN116542914A (en) Weld joint extraction and fitting method based on 3D point cloud
CN117047237B (en) Intelligent flexible welding system and method for special-shaped parts
CN112598752A (en) Calibration method based on visual identification and operation method
CN114800574B (en) Robot automatic welding system and method based on double three-dimensional cameras

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination