CN111673749B - Adjusting method of visual welding robot and visual welding robot - Google Patents

Adjusting method of visual welding robot and visual welding robot

Info

Publication number
CN111673749B
Authority
CN
China
Prior art keywords: welding, robot, visual, camera, coordinate system
Prior art date
Legal status: Active
Application number
CN202010515894.3A
Other languages
Chinese (zh)
Other versions
CN111673749A (en)
Inventor
吕洁印
周受钦
李继春
刘海林
Current Assignee
Shenzhen CIMC Intelligent Technology Co Ltd
Original Assignee
China International Marine Containers Group Co Ltd
Shenzhen CIMC Intelligent Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by China International Marine Containers Group Co Ltd, Shenzhen CIMC Intelligent Technology Co Ltd filed Critical China International Marine Containers Group Co Ltd
Priority to CN202010515894.3A priority Critical patent/CN111673749B/en
Publication of CN111673749A publication Critical patent/CN111673749A/en
Application granted granted Critical
Publication of CN111673749B publication Critical patent/CN111673749B/en

Classifications

    • B25J 9/00 Programme-controlled manipulators; B25J 9/16 Programme controls
    • B25J 9/1656 Programme controls characterised by programming, planning systems for manipulators; B25J 9/1664 characterised by motion, path, trajectory planning
    • B25J 9/1679 Programme controls characterised by the tasks executed
    • B25J 9/1694 Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion; B25J 9/1697 Vision controlled systems
    • B23K 37/00 Auxiliary devices or processes, not specially adapted to a procedure covered by only one of the preceding main groups; B23K 37/02 Carriages for supporting the welding or cutting element

Abstract

The embodiment of the present application discloses an adjustment method of a visual welding robot and the visual welding robot. The visual welding robot is provided with a camera, and the method comprises the following steps: synchronizing a control coordinate system of a control system of the visual welding robot with a base coordinate system of the visual welding robot; acquiring an image of a to-be-welded part captured by the camera to extract a welding seam track in the image and extract characteristic points of the welding seam track; obtaining coordinates of the characteristic points of the welding seam track in the robot base coordinate system; controlling the camera to move along the welding seam track according to the welding seam track and the coordinates of the characteristic points in the robot base coordinate system; and adjusting the position of the camera of the visual welding robot according to the picture shot by the camera moving along the welding seam track. This technical solution ensures that, when the visual welding robot is automatically adjusted and calibrated, the robot body and related parts are not damaged during the debugging process, and the safety of the debugging personnel is also ensured.

Description

Adjusting method of visual welding robot and visual welding robot
Technical Field
The present disclosure relates to the field of robot calibration technologies, and in particular, to a method and an apparatus for calibrating a visual welding robot, an electronic device, and a computer-readable storage medium.
Background
At present, robot calibration technology has not formed a uniform operating standard, so the robot body and the control system may be damaged during the debugging process when a robot is calibrated.
It is to be noted that the information disclosed in the above background section is only for enhancement of understanding of the background of the present application and therefore may include information that does not constitute prior art known to a person of ordinary skill in the art.
Disclosure of Invention
In order to solve the above problems, embodiments of the present application provide a method and an apparatus for adjusting a visual welding robot, an electronic device, and a computer-readable storage medium.
The adjustment method of a visual welding robot adopted by the present application comprises the following steps:
synchronizing a control coordinate system of a control system of the visual welding robot with a base coordinate system of the visual welding robot; acquiring an image of a to-be-welded part captured by a camera to extract a welding seam track in the image of the to-be-welded part and extract characteristic points of the welding seam track; obtaining coordinates of the characteristic points of the welding seam track in the robot base coordinate system; controlling the camera to move along the welding seam track according to the welding seam track and the coordinates of the characteristic points in the robot base coordinate system; and adjusting the position of the camera of the visual welding robot according to the picture shot by the camera moving along the welding seam track.
In another embodiment, based on the foregoing solution, synchronizing the control coordinate system of the control system of the visual welding robot with the base coordinate system of the visual welding robot comprises: detecting a position signal of the visual welding robot; if a preset position signal is detected, determining that the visual welding robot has returned to a mechanical origin, wherein the mechanical origin is the origin of the robot base coordinate system, and the preset position signal is the position signal produced when the visual welding robot returns to the mechanical origin; and setting the control coordinate system when it is determined that the visual welding robot has returned to the mechanical origin, so that the control coordinate system is synchronized with the robot base coordinate system.
In another embodiment, based on the foregoing solution, controlling the camera to move along the welding seam track according to the welding seam track and the coordinates of the characteristic points in the robot base coordinate system includes: taking the position of the characteristic point as the origin of a workpiece coordinate system, and acquiring coordinate data of each point of the welding seam track in the workpiece coordinate system; and controlling the camera to move along the welding seam track according to the coordinates of the characteristic points in the robot base coordinate system and the coordinate data of each point of the welding seam track in the workpiece coordinate system.
In a further embodiment, based on the foregoing, the visual welding robot has a welding gun, and the method further comprises: controlling the camera to move to shoot the feature points; when the feature point is positioned at the center of a picture shot by the camera, recording the coordinates of the camera in a robot base coordinate system; controlling the welding gun to move so that the welding gun moves to a position away from the characteristic point by a preset arcing distance; obtaining the coordinates of the welding gun in a robot base coordinate system; and determining the offset distance between the welding gun and the camera according to the camera and the coordinates of the welding gun in the robot base coordinate system.
In a further embodiment, based on the foregoing, the visual welding robot has a plurality of axes for controlling the welding gun to move, and before controlling the welding gun to move to a position away from the characteristic point by the preset arcing distance, the method further includes: debugging the movement of the welding gun, and controlling the plurality of axes of the visual welding robot to operate at a first speed limit within a first time length after the debugging is started; and after the first time length is reached, controlling the plurality of axes of the visual welding robot to operate at a second speed limit, wherein the second speed limit is greater than the first speed limit.
In a further embodiment, based on the foregoing scheme, the method further includes: controlling a welding gun to perform simulated copying welding according to the welding seam track to obtain the movement track of the welding gun; comparing the movement track of the welding gun with the welding seam track to obtain a first comparison result; controlling a welding gun to weld according to the welding seam track, and controlling the welding track of the welding gun to be shot by keeping the offset distance between the camera and the welding gun; obtaining a second comparison result according to the welding track and the welding seam track; and determining an adjustment result for the visual welding robot according to the first comparison result and the second comparison result, determining that the adjustment result for the visual welding robot is successful if the first comparison result is matched with the second comparison result, and determining that the adjustment result for the visual welding robot is failed if any one of the first comparison result and the second comparison result is not matched.
In another embodiment, based on the foregoing solution, after the debugging the movement of the welding gun, the method further includes: confirming whether the motion of each axis of the vision welding robot is stable or not and confirming whether the motion trail of the welding gun is smooth or not; when the stable movement of each shaft and the smooth movement track of the welding gun are determined, planning a welding process and a welding path of the welding gun according to the welding track and the offset distance; and if the planning of the welding process and the welding path meets the preset conditions, determining that the debugging results of the welding process and the welding path of the visual welding robot are successful.
In another embodiment, based on the foregoing scheme, before the obtaining of the image of the to-be-welded part acquired by the camera, the method further includes: shooting a plurality of images of the to-be-welded parts; and debugging the camera according to the image information of the images of the plurality of to-be-welded parts, wherein the debugging of the camera comprises one or more of determining exposure time, determining sampling frequency, selecting an image processing strategy and selecting a characteristic extraction strategy.
In another embodiment, based on the foregoing scheme, before capturing the images of the plurality of workpieces to be welded, the method further includes: confirming whether the robot body of the visual welding robot is deformed or not, and confirming whether an electrical control installation connecting line of the visual welding robot is correct or not and whether the electrical control installation connecting line is loosened or not; and if determining that the robot body is not deformed and the electrical control installation connecting wire is correct and is not loosened, shooting images of various to-be-welded parts.
According to another aspect of the present application, the present application further provides a visual welding robot, which includes: a robot body having a plurality of axes; a camera mechanism provided with a camera and connected with the robot body; welding equipment connected with the robot body and including a welding gun; and a controller connected with the robot body, the camera mechanism and the welding equipment, for controlling the camera mechanism and the welding equipment to execute the steps of the adjustment and calibration method of the visual welding robot described above.
According to another aspect of the present application, the present application further provides a calibration device of a visual welding robot, the visual welding robot having a camera, the device including:
the synchronization unit is used for synchronizing a control coordinate system of the control system of the visual welding robot with a base coordinate system of the visual welding robot;
the extraction unit is used for acquiring an image of the to-be-welded part acquired by the camera so as to extract a welding seam track in the image of the to-be-welded part and extract characteristic points of the welding seam track;
the acquisition unit is used for acquiring the coordinates of the characteristic points of the welding seam track in a robot base coordinate system;
the control unit is used for controlling the camera to move along the welding seam track according to the welding seam track and the coordinates of the characteristic points in the robot base coordinate system;
and the adjusting unit is used for adjusting the position of the camera of the visual welding robot according to the picture shot by the camera moving along the welding seam track.
According to another aspect of the present application, the present application further provides an adjusting apparatus of a visual welding robot, which includes a processor and a memory, wherein the memory stores computer readable instructions that, when executed by the processor, implement the adjustment method of the visual welding robot described above.
A computer readable storage medium having stored thereon computer readable instructions which, when executed by a processor of a computer, cause the computer to execute the tuning method of a visual welding robot as described above.
The technical scheme provided by the application embodiment can have the following beneficial effects:
when the visual welding robot is automatically adjusted and calibrated, the robot body and related parts are not damaged during the debugging process, and the safety of the debugging personnel is also ensured.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the application.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present application and together with the description, serve to explain the principles of the application. It is obvious that the drawings in the following description are only some embodiments of the application, and that for a person skilled in the art, other drawings can be derived from them without inventive effort. In the drawings:
fig. 1 is a flow chart illustrating a method of tuning a visual welding robot in accordance with an exemplary embodiment;
FIG. 2 is a flow chart of step S110 in another embodiment of the embodiment shown in FIG. 1;
FIG. 3 is a flowchart of step S140 in another embodiment of the embodiment shown in FIG. 1;
FIG. 4 is a flow chart illustrating a method of tuning a visual welding robot in accordance with another exemplary embodiment;
FIG. 5 is a flow chart illustrating a method of tuning a visual welding robot in accordance with another exemplary embodiment;
fig. 6 is a flow chart illustrating a method of tuning a visual welding robot in accordance with another exemplary embodiment;
fig. 7 is a flow chart illustrating a method of tuning a visual welding robot in accordance with another exemplary embodiment;
fig. 8 is a flow chart illustrating a method of tuning a visual welding robot in accordance with another exemplary embodiment;
fig. 9 is a flow chart illustrating a method of tuning a visual welding robot in accordance with another exemplary embodiment;
fig. 10 is a flow chart of a method of tuning a visual welding robot according to one embodiment of the present application;
fig. 11 is a block diagram illustrating a tuning apparatus 1100 of a visual welding robot according to an exemplary embodiment;
fig. 12 is a schematic diagram of a hardware configuration of a tuning apparatus of a visual welding robot according to an exemplary embodiment.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The embodiments described in the following exemplary embodiments do not represent all embodiments consistent with the present application. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present application, as detailed in the appended claims.
Fig. 1 is a flow chart illustrating a method for tuning a visual welding robot according to an exemplary embodiment. As shown in fig. 1, in an exemplary embodiment, the tuning method of the visual welding robot may include the following steps S110 to S150.
It should be noted that the calibration process for the vision welding robot includes the adjustment and calibration of the camera, and the calibration of the camera can be realized through the following steps S110 to S150.
Step S110: synchronize the control coordinate system of the control system of the visual welding robot with the base coordinate system of the visual welding robot.
A coordinate system is a plane or space defined by axes originating from a fixed origin. The control system of the visual welding robot measures and positions through the axes of its coordinate systems. The control of the visual welding robot involves a plurality of coordinate systems: the control coordinate system is the coordinate system used by the control system, and it can be set in synchronization with the base coordinate system of the visual welding robot, which is the coordinate system located at the base of the robot and facilitates moving the robot from one position to another.
By synchronizing the control coordinate system of the control system with the base coordinate system of the visual welding robot, each part of the visual welding robot can be controlled more conveniently.
Step S120: acquire an image of the to-be-welded part captured by the camera, extract the welding seam track in the image, and extract the characteristic points of the welding seam track.
The to-be-welded part is a workpiece waiting to be welded, on which the welding seam track indicates the points or lines where welding is needed.
The image of the to-be-welded part can be collected by the camera so that the welding seam track can be extracted from the image. In another embodiment, the image of the to-be-welded part may be obtained by a sensor. When the collected image of the to-be-welded part is obtained, it is preprocessed by an image processing assembly, for example by image binarization, median filtering and image morphology thinning, and the preprocessed image is then sent to the controller of the visual welding robot control system for processing.
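For illustration only, a minimal Python sketch of such a preprocessing chain is given below; the specific functions and parameters (Otsu thresholding, a 5-pixel median filter, thinning from the opencv-contrib package) are assumptions of this example and not part of the disclosed method.

    # Minimal sketch of the preprocessing chain described above:
    # binarization, median filtering, morphological thinning.
    import cv2
    import numpy as np

    def preprocess_weld_image(gray: np.ndarray) -> np.ndarray:
        # Binarize with Otsu's threshold so the weld groove stands out.
        _, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
        # Suppress speckle noise with a median filter.
        denoised = cv2.medianBlur(binary, 5)
        # Thin the weld region toward its centerline; cv2.ximgproc.thinning
        # requires opencv-contrib-python (assumed available here).
        skeleton = cv2.ximgproc.thinning(denoised)
        return skeleton

    # Usage: gray = cv2.imread("workpiece.png", cv2.IMREAD_GRAYSCALE)
    #        centerline = preprocess_weld_image(gray)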
The characteristic point of the welding seam track may be defined in advance. For example, the plane coordinate of the characteristic point may be the coordinate of the first turning point of the weld centerline; for a T-shaped fillet weld, the characteristic point coordinate may be the coordinate of the central extension line of the right stripe of the two broken stripes, or the coordinate of the first point where it contacts the left stripe. This is not limited herein.
It should be understood that, from the weld trajectory in the acquired image and the feature points of the weld trajectory, the positional relationship of the weld trajectory feature points and the visual welding robot can be obtained.
In the present embodiment, the extracted feature points of the weld trajectory, and the positional relationship implied therein can be used to adjust the position of the camera of the visual welding robot.
Step S130: obtain the coordinates of the characteristic points of the welding seam track in the robot base coordinate system.
As previously mentioned, the base coordinate system is the coordinate system located at the base of the visual welding robot. The positional relationship between the characteristic points of the welding seam track and the visual welding robot can be obtained through step S120. It follows that the coordinates of the characteristic points of the welding seam track in the robot base coordinate system can be obtained.
Step S140: control the camera to move along the welding seam track according to the welding seam track and the coordinates of the characteristic points in the robot base coordinate system.
The coordinates of all points of the welding seam track in the base coordinate system of the visual welding robot are obtained from the coordinates of the characteristic points of the welding seam track and the image of the welding seam track. The camera mounted on the visual welding robot is then controlled to move along the welding seam track according to these coordinates.
Step S150: adjust the position of the camera of the visual welding robot according to the picture shot by the camera moving along the welding seam track.
Automatic camera correction based on binocular vision means that, during teaching, the camera of the visual welding robot automatically adjusts its position according to the position of the shot welding seam track in the captured picture.
In the process of adjusting the visual welding robot, if an abnormality occurs, the power supply can be forcibly turned off, and the abnormality can be preset by a debugger and detected by sensors installed at various portions of the visual welding robot.
Therefore, the position of the camera of the robot is automatically adjusted and calibrated through the control system of the visual welding robot, the safety of debugging personnel can be guaranteed, the robot body and the control system are guaranteed not to be damaged in the debugging process, and labor cost is saved.
Fig. 2 is a flowchart of step S110 in another embodiment of the embodiment shown in fig. 1, in which the control coordinate system of the visual welding robot control system is synchronized with the base coordinate system of the visual welding robot, including steps S210 to S230.
Step S210, a position signal of the visual welding robot is detected.
The position signal may be a switching signal preset when the visual welding robot returns to the mechanical origin.
Step S220, if the preset position signal is detected, it is determined that the visual welding robot returns to a mechanical origin, the mechanical origin is an origin of a basic coordinate system of the robot, and the preset position signal is a position signal when the visual welding robot returns to the mechanical origin.
When the switching signal is detected, it can be determined that the visual welding robot has returned to the mechanical origin, and the robot can be better controlled based on the mechanical origin.
Step S230: set the control coordinate system when it is determined that the visual welding robot has returned to the mechanical origin, so that the control coordinate system is synchronized with the robot base coordinate system.
When it is determined that the visual welding robot has returned to the mechanical origin, the origin of the control coordinate system of the control system may be set to the position of the mechanical origin, and thus, synchronization of the control coordinate system and the base coordinate system may be achieved.
Fig. 3 is a flowchart of another embodiment of step S140 in the embodiment shown in fig. 1, in which step S140 controls the camera to move along the weld track according to the weld track and the coordinates of the feature points in the robot base coordinate system, and may include the following steps S310 and S320.
Step S310: take the position of the characteristic point as the origin of a workpiece coordinate system, and acquire the coordinate data of each point of the welding seam track in the workpiece coordinate system.
The workpiece coordinate system is a coordinate system formed by a workpiece origin and coordinate axes, and its coordinate data are defined relative to the base coordinate system.
Step S320: control the camera to move along the welding seam track according to the coordinates of the characteristic points in the robot base coordinate system and the coordinate data of each point of the welding seam track in the workpiece coordinate system.
In this way, the welding seam track can be photographed by the camera to form a plurality of pictures, and the position of the camera is automatically adjusted by analyzing the position of the characteristic point of the welding seam track in each picture.
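As a minimal illustration of chaining the workpiece coordinate system to the robot base coordinate system in steps S310 and S320, the Python sketch below assumes a pure translation between the two frames and uses placeholder coordinates; a full implementation would also carry a rotation.

    import numpy as np

    # Hypothetical example: the feature point is expressed in the robot base
    # frame, and the weld-track points are expressed in the workpiece frame
    # whose origin is that feature point. A pure translation is assumed here
    # for simplicity; a real setup would use a full homogeneous transform.
    feature_in_base = np.array([850.0, 120.0, 300.0])       # mm, assumed values
    track_in_workpiece = np.array([[0.0, 0.0, 0.0],
                                   [10.0, 0.5, 0.0],
                                   [20.0, 1.2, 0.0]])        # mm, assumed values

    track_in_base = track_in_workpiece + feature_in_base      # point-by-point shift
    # Each row of track_in_base is a target the camera is commanded to follow.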
Fig. 4 is a flowchart illustrating a tuning method of a visual welding robot according to another exemplary embodiment. As shown in fig. 4, in an exemplary embodiment, the visual welding robot has a welding gun, and the tuning method of the visual welding robot may further include the following steps S410 to S450.
Step S410, controlling the camera to move to shoot the feature points;
step S420, when the feature point is located at the center of the picture shot by the camera, recording the coordinates of the camera in a robot base coordinate system;
step S430, controlling the welding gun to move so that the welding gun moves to a position which is away from the characteristic point by a preset arc starting distance;
step S440, obtaining coordinates of the welding gun in a robot base coordinate system;
and step S450, determining the offset distance between the welding gun and the camera according to the camera and the coordinates of the welding gun in the robot base coordinate system.
The adjustment process of the visual welding robot further includes setting the offset distance between the welding gun and the camera. Specifically, the position of the weld seam characteristic point is used as a reference point, and the position of the camera is adjusted so that the camera moves within a proper range and the characteristic point finally lies at the center of the picture shot by the camera. After the position of the camera is determined, the welding gun is moved so that its distance from the characteristic point equals the preset arc starting distance. The offset distance between the welding gun and the camera is thus obtained, so that the camera can shoot the welding process based on the structured light principle.
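The sketch below illustrates, with assumed placeholder coordinates, how the offset described above could be computed from the two recorded poses; the values are not taken from the patent.

    import numpy as np

    # Camera pose recorded when the feature point is centered in its picture,
    # and torch pose recorded at the preset arc-starting distance from the
    # feature point, both in the robot base frame (values are assumptions).
    camera_in_base = np.array([845.0, 118.0, 410.0])   # mm
    torch_in_base  = np.array([845.0, 118.0, 325.0])   # mm

    offset_vector = torch_in_base - camera_in_base     # kept constant while welding
    offset_distance = np.linalg.norm(offset_vector)
    print(offset_vector, offset_distance)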
In another embodiment, the feature point may also be located at the leftmost side or the rightmost side of the picture taken by the camera, and may be specifically set by the adjuster according to specific situations, which is not limited herein.
It should be noted that the preset arcing distance, the arcing voltage and the arcing current all affect the welding effect of the visual welding robot. Thus, in a further embodiment, the tuning process of the visual welding robot may further comprise setting of arcing parameters, including arcing distance, arcing voltage and arcing current.
Fig. 5 is a flowchart illustrating a tuning method of a visual welding robot according to another exemplary embodiment, based on the tuning method of a visual welding robot illustrated in fig. 4, the visual welding robot having a plurality of axes for controlling a welding gun to move, the method further including steps S510 and S520 before step S430.
Step S510, debugging the movement of the welding gun, and controlling a plurality of shafts of the visual welding robot to operate at a first speed limit in a first time length after the debugging is started;
and S520, after the first time length is reached, controlling a plurality of shafts of the visual welding robot to operate at a second speed limit, wherein the second speed limit is greater than the first speed limit.
The visual welding robot has a plurality of axes; that is, it is a multi-axis robot, a multipurpose manipulator capable of automatic control, repeatable programming and multiple degrees of freedom, whose axes of motion can establish spatial right-angle relationships. According to the number of axes of the manipulator, multi-axis robots are classified into 4-axis, 5-axis, 6-axis robots and so on, which are free to move in 4, 5 or 6 directions respectively.
The tuning process for the vision welding robot also includes tuning the motion of the various axes.
Specifically, at the initial stage of the motion debugging of each axis, that is, within the first time length after debugging starts, each axis is controlled to operate at the first speed limit; as debugging continues and the debugging results are optimized, the speed of each axis can be gradually increased to the second speed limit. This improves the safety of the equipment, ensures the safety of the debugging personnel, and prevents the robot body and the control system hardware from being damaged during debugging.
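A schematic Python sketch of this staged speed limit is given below; the controller interface (set_axis_speed_limit) and the concrete numbers are assumptions introduced only for illustration.

    import time

    FIRST_SPEED_LIMIT = 50.0    # mm/s, conservative limit right after debugging starts
    SECOND_SPEED_LIMIT = 200.0  # mm/s, applied only after the first time length
    FIRST_DURATION_S = 600.0    # "first time length" in seconds (assumed)

    def ramp_axis_speeds(controller, axes):
        start = time.monotonic()
        for axis in axes:
            controller.set_axis_speed_limit(axis, FIRST_SPEED_LIMIT)
        # Keep the conservative limit until the first time length has elapsed.
        while time.monotonic() - start < FIRST_DURATION_S:
            time.sleep(1.0)
        for axis in axes:
            controller.set_axis_speed_limit(axis, SECOND_SPEED_LIMIT)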
Fig. 6 is a flowchart illustrating a tuning method of a visual welding robot according to another exemplary embodiment, and on the basis of the tuning method of a visual welding robot illustrated in fig. 5, in this embodiment, steps S610 to S650 may be further included.
Step S610: control the welding gun to perform simulated copying welding according to the welding seam track to obtain the movement track of the welding gun;
step S620, comparing the movement track and the welding seam track of the welding gun to obtain a first comparison result;
step S630, controlling a welding gun to weld according to the welding seam track, and controlling the welding track of the welding gun to be shot by keeping the offset distance between the camera and the welding gun;
step S640, obtaining a second comparison result according to the welding track and the welding seam track;
and step S650, determining an adjustment result for adjusting the vision welding robot according to the first comparison result and the second comparison result, determining that the adjustment result for the vision welding robot is successful if the first comparison result and the second comparison result are both matched, and determining that the adjustment result for the vision welding robot is failed if any one of the first comparison result and the second comparison result is not matched.
The tuning process for the visual welding robot also includes performing a welding test. In order to better observe how the running state of the visual welding robot during operation and the external environment during welding affect the performance of the welding robot, the welding test is divided into two stages. In the first stage, the welding gun is controlled to perform simulated copying welding along the welding seam track without arc starting, so as to obtain the movement track of the welding gun; this movement track can be obtained through a position sensor mounted on the welding gun or in any other way. The movement track is compared with the welding seam track to obtain the first comparison result, and the control quality of the control system over the welding gun can be analyzed from this result: if the deviation between the movement track and the welding seam track is smaller than a preset value, it can be determined that the control system controls the welding gun accurately. Copying welding therefore reveals the control effect of the control system on welding while saving welding consumables. The second stage is the actual welding process: the welding track produced after actual welding is recorded by the camera, which can verify both the adjustment result of the camera and the effect of the actual welding, so that the welding process of the visual welding robot is tested and the adjustment of the visual welding robot is verified through the test result.
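A minimal sketch of the track comparison behind the first and second comparison results is shown below; the threshold and the equal-length sampling of the two tracks are assumptions made for this example.

    import numpy as np

    # Maximum point-wise deviation between a recorded track and the reference
    # weld seam track, checked against a preset value.
    def tracks_match(recorded: np.ndarray, reference: np.ndarray,
                     preset_value_mm: float = 0.5) -> bool:
        deviations = np.linalg.norm(recorded - reference, axis=1)
        return float(deviations.max()) < preset_value_mm

    # first_result  = tracks_match(simulated_copying_track, weld_seam_track)
    # second_result = tracks_match(actual_welding_track,   weld_seam_track)
    # The adjustment succeeds only if both results are True.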
Fig. 7 is a flowchart illustrating a tuning method of a visual welding robot according to another exemplary embodiment, and after tuning the movement of the welding gun in step S510, the tuning method of a visual welding robot illustrated in fig. 5 may further include the following steps:
step S710, confirming whether the motion of each axis of the vision welding robot is stable and confirming whether the motion track of the welding gun is smooth;
s720, after the stable movement of each shaft and the smooth movement track of the welding gun are determined, planning a welding process and a welding path of the welding gun according to the welding seam track and the offset distance;
step S730, if the planning of the welding process and the welding path satisfies the preset condition, determining that the debugging result of the welding process and the welding path of the visual welding robot is successful.
In this embodiment, based on the foregoing method, after the movement of the welding gun has been debugged, the debugging process of the visual welding robot may further include determining whether each axis of the visual welding robot moves stably and whether the movement track of the welding gun is smooth. After it is determined that each axis moves stably and the movement track of the welding gun is smooth, parameters may be selected manually through the control panel of the control system, or data such as the type of the welding seam and the process may be obtained from a sensor and processed by the corresponding module in the control system, so as to plan the welding process and the welding path.
Fig. 8 is a flowchart illustrating a tuning method of a visual welding robot according to another exemplary embodiment, which may further include the following steps before step S120 on the basis of the tuning method of a visual welding robot illustrated in fig. 1:
step S810, shooting a plurality of images of the to-be-welded part;
Step S820: debug the camera according to the image information of the plurality of images of the to-be-welded parts, wherein debugging the camera includes one or more of determining the exposure time, determining the sampling frequency, selecting an image processing strategy and selecting a feature extraction strategy.
The debugging of the visual welding robot further includes debugging the shooting parameters of the camera, the shooting parameters including one or more of the exposure time, the sampling frequency, the image processing strategy and the feature extraction strategy. Specifically, images of the to-be-welded part can be shot with different exposure times, different sampling frequencies, or different combinations of exposure time and frequency; the shot images are transmitted to the control system, the control system determines the final exposure time and sampling frequency according to image quality standards, and different image processing strategies and feature extraction strategies are then tried for image processing. The shooting parameters of the camera are determined according to the characteristics of the different to-be-welded parts, the brightness of the environment, and so on.
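As a minimal sketch of selecting one shooting parameter by an image quality criterion, the Python fragment below scores candidate exposure times by image sharpness; the candidate values, the sharpness metric (variance of the Laplacian) and the capture() call are assumptions for illustration only.

    import cv2
    import numpy as np

    CANDIDATE_EXPOSURES_MS = [2.0, 5.0, 10.0, 20.0]

    def sharpness(gray: np.ndarray) -> float:
        # Variance of the Laplacian: higher means a sharper, better-exposed image.
        return float(cv2.Laplacian(gray, cv2.CV_64F).var())

    def pick_exposure(capture) -> float:
        scores = {}
        for exposure in CANDIDATE_EXPOSURES_MS:
            gray = capture(exposure_ms=exposure)   # hypothetical camera call
            scores[exposure] = sharpness(gray)
        return max(scores, key=scores.get)         # best-scoring exposure time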
Fig. 9 is a flowchart illustrating a tuning method of a visual welding robot according to another exemplary embodiment, which may further include the following steps before step S810 on the basis of the tuning method of a visual welding robot illustrated in fig. 8:
step S910, confirming whether the robot body of the vision welding robot is deformed or not, and confirming whether the electrical control installation connecting line of the vision welding robot is correct or not and whether the electrical control installation connecting line is loosened or not;
and step S920, if the robot body is determined to be not deformed and the electric control installation connecting line is determined to be correct and not to be loosened, shooting images of various to-be-welded parts.
In this embodiment, before the plurality of images of the to-be-welded part are shot, the debugging process of the visual welding robot may further include determining whether the robot body of the visual welding robot is deformed, and confirming whether the electrical control installation connecting line of the visual welding robot is correct and whether it is loosened. If it is determined that the robot body is not deformed and the electrical control installation connecting line is correct and not loosened, shooting of the plurality of images of the to-be-welded part is started.
For ease of understanding, the inventive concept of the present application will be described below with reference to a specific tuning procedure of a visual welding robot.
Fig. 10 is a flowchart of a tuning method of a visual welding robot according to an embodiment of the present application. As shown in fig. 10, in this embodiment, the calibration process of the visual welding robot sequentially includes robot body debugging 1001, electrical control installation debugging 1002, camera acquisition debugging 1003, setting the working stroke range of each axis 1004, setting the axis direction and pulse equivalent 1005, returning to the mechanical origin 1006, camera follow-up profiling and centering debugging 1007, setting the offset distance between the camera and the welding gun 1008, adjusting the speed of each axis 1009, smoothing the welding gun track 1010, planning the welding gun path and welding process 1011, setting the arc starting voltage, arc starting current and arc starting distance 1012, and the welding test 1013.
The robot body debugging 1001 includes manufacturing the mechanical parts strictly according to the scheme design, simulation analysis and mechanical assembly drawings, assembling the parts, ensuring that the assembly is correct and firm with no missing parts, avoiding impact deformation, damage and similar conditions during assembly, checking whether the screw rods, sliding rails, racks and other parts of the body are properly lubricated, checking whether the transmission screw rod is jammed, and ensuring that the motion of each axis is unobstructed. If the installation is determined to be correct and the robot body is not deformed, the next step is carried out.
The electrical control installation and debugging 1002 includes checking whether the installation connection of each electrical element is correct, whether any connection is loose, and whether the elements can communicate normally; checking whether the wiring of the electrical circuit is normal, whether the installation and performance of each safety limit are safe and reliable, and whether the electrical outputs and settings are normal; and checking the input-signal indicator LEDs on the terminal board. By simulating a trigger of the origin switch, if the state of the corresponding LED changes, the origin signal has reached the terminal board, which confirms that the drive control, limit and zero-point wiring from the terminal board to the welding robot are correct and can greatly shorten the debugging time. Whether the signals of each port are transmitted normally is determined through the on-line port debugging module of the upper computer software: the signal is effective when the indicator dot is green and ineffective (no input or output) when it is red. If every electrical element is installed without error or looseness, the next step is carried out.
The camera acquisition and debugging 1003 includes debugging the exposure time, the sampling frequency, the image processing method and the feature extraction strategy, so that the obtained weld seam information is more accurate and better applied to welding control. After the setting is finished, the next step is carried out.
Setting the working stroke range of each axis 1004 includes setting the working stroke range of each axis based on the actual size of the visual welding robot so that the soft limit function can be used. After the setting is finished, the next step is carried out.
Setting the axis direction and pulse equivalent 1005 includes moving each axis of the robot, determining whether the motion direction of each axis is correct, modifying the axis direction to 1 or -1 through the upper computer if the direction is incorrect, and setting the related parameters of the servo driver. The smaller the pulse equivalent, the higher the control resolution; the value of the pulse equivalent also directly affects the maximum feed speed.
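For a rough sense of this trade-off, the short calculation below (with assumed, not patent-specified, numbers) shows how the pulse equivalent bounds the maximum feed speed.

    # v_max = pulse_equivalent * max_pulse_frequency
    pulse_equivalent_mm = 0.001          # mm of travel per command pulse (assumed)
    max_pulse_frequency_hz = 200_000     # pulses per second the driver accepts (assumed)

    v_max_mm_s = pulse_equivalent_mm * max_pulse_frequency_hz
    print(v_max_mm_s)  # 200.0 mm/s: halving the pulse equivalent doubles the
                       # resolution but also halves the attainable feed speed.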
Returning to the mechanical origin 1006 includes determining, if a preset position signal is detected, that the visual welding robot has returned to the mechanical origin, wherein the mechanical origin is the origin of the robot base coordinate system and the preset position signal is the position signal produced when the visual welding robot returns to the mechanical origin.
The camera follow-up profiling and centering debugging 1007 comprises the steps of acquiring an image of a to-be-welded part acquired by a camera to extract a welding seam track in the image of the to-be-welded part and extracting characteristic points of the welding seam track; obtaining coordinates of the characteristic points of the welding seam track in a robot base coordinate system; controlling the camera to move along the welding seam track according to the welding seam track and the coordinates of the characteristic points in the robot base coordinate system; and adjusting the position of the camera of the visual welding robot according to the picture shot by the camera moving along the welding seam track.
Setting an offset distance 1008 between the camera and the welding gun, including controlling the camera to move to shoot the feature points; when the feature point is positioned at the center of a picture shot by the camera, recording the coordinates of the camera in a robot base coordinate system; controlling the welding gun to move so that the welding gun moves to a position away from the characteristic point by a preset arcing distance; obtaining the coordinates of the welding gun in a robot base coordinate system; and determining the offset distance between the welding gun and the camera according to the camera and the coordinates of the welding gun in the robot base coordinate system.
Adjusting the speed of each axis 1009, including debugging the motion of the welding gun, and controlling a plurality of axes of the visual welding robot to operate at a first speed limit in a first time period after the debugging is started; and after the first time length is reached, controlling a plurality of shafts of the visual welding robot to operate at a second speed limit, wherein the second speed limit is greater than the first speed limit.
Smoothing the welding gun track 1010 includes setting an initial welding gun path according to the specification of the welding seam track and optimizing the initial path through a smoothing strategy, so as to obtain a welding gun track that meets the smoothing condition.
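The patent does not specify the smoothing strategy; the Python sketch below shows one simple possibility (a moving average over the way-points) purely as an illustrative assumption.

    import numpy as np

    def smooth_path(waypoints: np.ndarray, window: int = 5) -> np.ndarray:
        # Average each interior way-point with its neighbours; end points stay fixed.
        smoothed = np.array(waypoints, dtype=float)
        half = window // 2
        for i in range(half, len(waypoints) - half):
            smoothed[i] = waypoints[i - half:i + half + 1].mean(axis=0)
        return smoothed

    # raw_path = np.array([...])            # initial gun path from the seam track
    # gun_path = smooth_path(raw_path)      # path handed to the motion planner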
The welding gun path welding process planning 1011 includes planning and adjusting the welding path and the welding process of the welding gun according to the characteristics of the welding seam, such as the width of the seam, the thermal deformation, the coordinate distance and other factors.
Setting the arc starting voltage, arc starting current and arc starting distance 1012 includes debugging on the basis of the arc starting voltage, arc starting current and arc starting distance given by the control system, so as to determine the final values according to actual conditions.
The welding test 1013 comprises controlling a welding gun to perform simulated profiling welding according to a welding seam track to obtain a movement track of the welding gun; comparing the movement track of the welding gun with the welding seam track to obtain a first comparison result; controlling a welding gun to weld according to the welding seam track, and controlling the welding track of the welding gun to be shot by keeping the offset distance between the camera and the welding gun; obtaining a second comparison result according to the welding track and the welding seam track; and determining a calibration result for calibrating the visual welding robot according to the first comparison result and the second comparison result.
And if the first comparison result and the second comparison result are both matched, determining that the adjustment result of the visual welding robot is successful, and if any one of the first comparison result and the second comparison result is not matched, determining that the adjustment result of the visual welding robot is failed.
In this way, the safety of the debugging personnel can be guaranteed, the robot body and the control system hardware are not damaged during debugging, welding materials are saved, costs are reduced, and the safety of the equipment is enhanced by measures such as the slow-to-fast movement speed.
An embodiment of the present application further shows a visual welding robot, which includes: a robot body; the robot body has a plurality of axes; the camera mechanism is provided with a camera and is connected with the robot body; the welding equipment is connected with the robot body and comprises a welding gun; and the controller is connected with the robot body, the camera mechanism and the welding equipment and is used for controlling the camera mechanism and the welding equipment to execute the steps of the adjusting and calibrating method of the visual welding robot.
Fig. 11 is a block diagram illustrating a setup apparatus 1100 of a visual welding robot according to an exemplary embodiment, which includes a synchronization unit 1101, an extraction unit 1102, an obtaining unit 1103, a control unit 1104, and a setup unit 1105, as shown in fig. 11.
A synchronization unit 1101 for synchronizing a control coordinate system of the visual welding robot control system with a base coordinate system of the visual welding robot;
the extraction unit 1102 is used for acquiring an image of the to-be-welded part acquired by the camera, so as to extract a weld track in the image of the to-be-welded part and extract feature points of the weld track;
an obtaining unit 1103, configured to obtain coordinates of feature points of the weld trajectory in a robot-based coordinate system;
the control unit 1104 is used for controlling the camera to move along the welding seam track according to the welding seam track and the coordinates of the characteristic points in the robot base coordinate system;
the adjusting unit 1105 is configured to adjust a position of the camera of the visual welding robot according to a picture taken by the camera moving along the welding seam track.
It should be noted that the apparatus provided in the foregoing embodiment and the method provided in the foregoing embodiment belong to the same concept, and the specific manner in which each module and unit execute operations has been described in detail in the method embodiment, and is not described again here.
In another exemplary embodiment, the present application further provides a tuning apparatus of a visual welding robot, including a processor and a memory, wherein the memory has stored thereon computer readable instructions, which when executed by the processor, implement the tuning method of the visual welding robot as described above.
Referring to fig. 12, fig. 12 is a schematic diagram illustrating a hardware structure of an adjusting apparatus of a visual welding robot according to an exemplary embodiment.
It should be noted that the device is merely an example adapted to the present application and should not be considered as limiting the scope of use of the application in any way. Nor can the device be interpreted as needing to rely on, or necessarily having, one or more components of the exemplary tuning apparatus of the visual welding robot shown in fig. 12.
The hardware structure of the apparatus may be greatly different due to the difference of configuration or performance, as shown in fig. 12, the apparatus includes: a power source 1210, an interface 1230, at least one memory 1250, and at least one Central Processing Unit (CPU) 1270.
The power source 1210 is used to provide operating voltage for each hardware device on the device.
The interface 1230 includes at least one wired or wireless network interface 1231, at least one serial-to-parallel conversion interface 1233, at least one input/output interface 1235, and at least one USB interface 1237, etc. for communicating with external devices.
The memory 1250, as a carrier for storing resources, can be a read-only memory, a random access memory, a magnetic disk or an optical disk; the resources stored on it, such as an operating system 1251, application programs 1253 or data 1255, can be stored in a transient or permanent manner. The operating system 1251 is used to manage and control the hardware devices and the application programs 1253 on the device, so as to implement the computation and processing of the data 1255 by the central processor 1270; it may be Windows Server, Mac OS X, Unix, Linux, and so on. The application programs 1253 are computer programs that perform at least one particular task on top of the operating system 1251 and may include at least one module, each of which may contain a respective series of computer-readable instructions for the device.
Central processor 1270 may include one or more processors and is arranged to communicate with memory 1250 via a bus for computing and processing data 1255 in memory 1250.
As described above in detail, the tuning apparatus to which the vision welding robot of the present application is applied will perform the tuning method of the vision welding robot as described above by the central processor 1270 reading a series of computer readable instructions stored in the memory 1250.
Furthermore, the present application can also be implemented by hardware circuits or hardware circuits in combination with software instructions, and thus, the implementation of the present application is not limited to any specific hardware circuits, software, or a combination of the two.
In another exemplary embodiment, the present application further provides a computer readable storage medium having a computer program stored thereon, which when executed by a processor, implements the tuning method of the visual welding robot as described above. The computer-readable storage medium may be included in the tuning device of the visual welding robot described in the above-described embodiments, or may be separately present without being assembled into the tuning device of the visual welding robot.
The above description is only a preferred exemplary embodiment of the present application, and is not intended to limit the embodiments of the present application, and those skilled in the art can easily make various changes and modifications according to the main concept and spirit of the present application, so that the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (10)

1. A method for adjusting a visual welding robot, wherein the visual welding robot is provided with a camera, the method comprises the following steps:
synchronizing a control coordinate system of a control system of the visual welding robot with a base coordinate system of the visual welding robot;
acquiring an image of a to-be-welded part acquired by the camera to extract a weld track in the image of the to-be-welded part and extract characteristic points of the weld track;
obtaining coordinates of the characteristic points of the welding seam track in the robot base coordinate system;
controlling the camera to move along the welding seam track according to the welding seam track and the coordinates of the characteristic points in the robot base coordinate system;
and adjusting the position of the camera of the visual welding robot according to the picture shot by the camera moving along the welding seam track.
2. The method of claim 1, wherein synchronizing the control coordinate system of the control system of the visual welding robot with the base coordinate system of the visual welding robot comprises:
detecting a position signal of the vision welding robot;
if a preset position signal is detected, determining that the visual welding robot returns to a mechanical origin, wherein the mechanical origin is the origin of the robot base coordinate system, and the preset position signal is a position signal when the visual welding robot returns to the mechanical origin;
setting the control coordinate system when it is determined that the visual welding robot returns to the mechanical origin to synchronize the control coordinate system with the robot base coordinate system.
3. The method of claim 1, wherein controlling the camera to move along the weld trajectory according to the weld trajectory and the coordinates of the feature points in the robot base coordinate system comprises:
taking the positions of the characteristic points as the origin of a workpiece coordinate system, and acquiring coordinate data of each point of the welding seam track in the workpiece coordinate system;
and controlling the camera to move along the welding seam track according to the coordinates of the characteristic points in the robot base coordinate system and the coordinate data of each point of the welding seam track in the workpiece coordinate system.
4. The method of claim 1, wherein the visual welding robot has a welding gun, the method further comprising:
controlling the camera to move to shoot the characteristic point;
when the characteristic point is positioned at the center of the picture shot by the camera, recording the coordinates of the camera in the robot base coordinate system;
controlling the welding gun to move to a position that is a preset arc starting distance away from the characteristic point;
obtaining the coordinates of the welding gun in the robot base coordinate system;
and determining the offset distance between the welding gun and the camera according to the coordinates of the camera and the welding gun in the robot base coordinate system.
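
For illustration, once the two poses of claim 4 are recorded in the base frame, the offset is just their difference. The numeric values below are assumed placeholders.

import numpy as np

def torch_camera_offset(camera_pos_base, torch_pos_base):
    """Offset vector from the camera pose to the welding-gun pose (base frame)."""
    return np.asarray(torch_pos_base, dtype=float) - np.asarray(camera_pos_base, dtype=float)

if __name__ == "__main__":
    # Camera position recorded when the characteristic point is centred in the image.
    camera_at_feature = [300.0, 50.0, 200.0]    # mm, assumed values
    # Welding-gun position recorded at the preset arc starting distance.
    torch_at_arc_start = [300.0, 85.0, 140.0]   # mm, assumed values
    print(torch_camera_offset(camera_at_feature, torch_at_arc_start))
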
5. The method of claim 4, wherein the visual welding robot has a plurality of axes for controlling the welding gun to move, and the method further comprises, before the controlling of the welding gun to move to the position that is the preset arc starting distance away from the characteristic point:
debugging the movement of the welding gun, and controlling the plurality of axes of the visual welding robot to operate at a first speed limit within a first time period after the start of debugging;
and after the first time period has elapsed, controlling the plurality of axes of the visual welding robot to operate at a second speed limit, wherein the second speed limit is greater than the first speed limit.
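
For illustration, the two-stage speed limiting of claim 5 can be expressed as a simple time-based cap. The limit values, the period length, and the set_axis_speed_limit hook are assumptions for the sketch.

FIRST_SPEED_LIMIT = 0.10   # e.g. 10 % of rated axis speed (assumed)
SECOND_SPEED_LIMIT = 0.50  # raised cap after the first period (assumed)
FIRST_PERIOD_S = 60.0      # assumed length of the first time period

def axis_speed_limit(elapsed_s):
    """Speed cap to apply to every axis, given time elapsed since debugging started."""
    return FIRST_SPEED_LIMIT if elapsed_s < FIRST_PERIOD_S else SECOND_SPEED_LIMIT

def apply_speed_limits(robot, elapsed_s):
    """robot.set_axis_speed_limit(axis, limit) is an assumed controller hook."""
    limit = axis_speed_limit(elapsed_s)
    for axis in robot.axes:
        robot.set_axis_speed_limit(axis, limit)
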
6. The method of claim 5, further comprising:
controlling the welding gun to perform simulated copying welding along the welding seam track, so as to obtain the motion track of the welding gun;
comparing the motion track of the welding gun with the welding seam track to obtain a first comparison result;
controlling the welding gun to weld along the welding seam track, and controlling the camera and the welding gun to maintain the offset distance so that the camera shoots the welding track produced by the welding gun;
comparing the welding track with the welding seam track to obtain a second comparison result;
and determining an adjustment result of the visual welding robot according to the first comparison result and the second comparison result: if the first comparison result and the second comparison result both indicate a match, determining that the adjustment of the visual welding robot is successful; and if either the first comparison result or the second comparison result indicates a mismatch, determining that the adjustment of the visual welding robot has failed.
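
For illustration, the double check of claim 6 can be reduced to comparing two recorded tracks against the seam track and requiring both to match. The nearest-point deviation metric and the 0.5 mm tolerance below are assumptions for the sketch.

import numpy as np

def max_deviation_mm(track, seam):
    """Largest distance from any track point to its nearest seam point."""
    track = np.asarray(track, dtype=float)
    seam = np.asarray(seam, dtype=float)
    dists = np.linalg.norm(track[:, None, :] - seam[None, :, :], axis=2)
    return float(dists.min(axis=1).max())

def adjustment_successful(simulated_track, welded_track, seam, tol_mm=0.5):
    """Both the simulated copying track and the actual weld track must match."""
    first_match = max_deviation_mm(simulated_track, seam) <= tol_mm
    second_match = max_deviation_mm(welded_track, seam) <= tol_mm
    return first_match and second_match
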
7. The method of claim 5, wherein after the debugging of the movement of the welding gun, the method further comprises:
confirming whether the motion of each axis of the visual welding robot is stable and whether the motion track of the welding gun is smooth;
when it is determined that each axis moves stably and the motion track of the welding gun is smooth, planning a welding process and a welding path for the welding gun according to the welding seam track and the offset distance;
and if the planning of the welding process and the welding path meets preset conditions, determining that the debugging of the welding process and the welding path of the visual welding robot is successful.
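
For illustration, the smoothness confirmation mentioned in claim 7 could be automated with a simple geometric test on the recorded track. The bend-angle criterion and the 15-degree threshold are assumptions for the sketch; a real system might instead inspect axis velocity or jerk profiles.

import numpy as np

def track_is_smooth(points, max_bend_deg=15.0):
    """True if no bend between consecutive track segments exceeds the threshold."""
    p = np.asarray(points, dtype=float)
    v = np.diff(p, axis=0)                   # segment vectors along the track
    v = v[np.linalg.norm(v, axis=1) > 1e-9]  # drop repeated points
    if len(v) < 2:
        return True
    v = v / np.linalg.norm(v, axis=1, keepdims=True)
    cos_bend = np.clip(np.einsum("ij,ij->i", v[:-1], v[1:]), -1.0, 1.0)
    return bool(np.all(np.degrees(np.arccos(cos_bend)) <= max_bend_deg))
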
8. The method according to claim 1, wherein before the camera captures the image of the to-be-welded part, the method further comprises:
shooting a plurality of images of the to-be-welded parts;
and debugging the camera according to the image information of the plurality of images, wherein the debugging of the camera comprises one or more of determining an exposure time, determining a sampling frequency, selecting an image processing strategy, and selecting a feature extraction strategy.
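
For illustration, the camera-debugging choices listed in claim 8 can be gathered in a small settings record, and the exposure can be chosen with a naive brightness sweep. The candidate exposures, the mid-grey target, and the capture() hook are assumptions for the sketch.

from dataclasses import dataclass
import numpy as np

@dataclass
class CameraDebugResult:
    exposure_ms: float
    sampling_hz: float
    image_processing: str     # e.g. "gaussian_blur+canny" (assumed label)
    feature_extraction: str   # e.g. "polyline_vertices" (assumed label)

def pick_exposure(capture, candidate_exposures_ms=(2.0, 5.0, 10.0, 20.0)):
    """Pick the exposure whose mean image intensity is closest to mid-grey (128).

    capture(exposure_ms) is an assumed hook returning a grayscale numpy image.
    """
    def distance_from_mid_grey(ms):
        return abs(float(np.mean(capture(ms))) - 128.0)
    return min(candidate_exposures_ms, key=distance_from_mid_grey)
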
9. The method according to claim 8, wherein before the shooting of the plurality of images of the to-be-welded parts, the method further comprises:
confirming whether the robot body of the visual welding robot is deformed, and confirming whether the electrical control wiring of the visual welding robot is correctly connected and whether any of its connections is loose;
and if it is determined that the robot body is not deformed and the electrical control wiring is correctly connected and not loose, shooting the plurality of images of the to-be-welded parts.
10. A visual welding robot, comprising:
a robot body having a plurality of axes;
a camera mechanism provided with a camera and connected with the robot body;
welding equipment connected with the robot body and comprising a welding gun;
and a controller connected to the robot body, the camera mechanism and the welding equipment, and configured to control the camera mechanism and the welding equipment to perform the steps of the adjusting method of the visual welding robot according to any one of claims 1 to 9.
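
For illustration, the apparatus of claim 10 maps naturally onto a small object model in which a controller holds the body, camera mechanism, and welding equipment. The class and attribute names below are assumptions made for the sketch; the claim does not prescribe any software structure.

from dataclasses import dataclass, field
from typing import List

@dataclass
class RobotBody:
    axes: List[str] = field(default_factory=lambda: [f"J{i}" for i in range(1, 7)])

@dataclass
class CameraMechanism:
    camera: str = "industrial-area-scan-camera"    # placeholder description

@dataclass
class WeldingEquipment:
    welding_gun: str = "gas-shielded welding gun"  # placeholder description

class Controller:
    """Orchestrates the adjusting steps of claims 1-9 (step bodies omitted)."""
    def __init__(self, body, camera_mechanism, welding_equipment):
        self.body = body
        self.camera_mechanism = camera_mechanism
        self.welding_equipment = welding_equipment

    def run_adjustment(self):
        # 1) synchronize frames, 2) extract the seam and characteristic points,
        # 3) derive the torch-camera offset, 4) stage axis speed limits,
        # 5) verify the simulated and welded tracks against the seam track.
        pass

if __name__ == "__main__":
    controller = Controller(RobotBody(), CameraMechanism(), WeldingEquipment())
    print(controller.body.axes)
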
CN202010515894.3A 2020-06-09 2020-06-09 Adjusting method of visual welding robot and visual welding robot Active CN111673749B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010515894.3A CN111673749B (en) 2020-06-09 2020-06-09 Adjusting method of visual welding robot and visual welding robot

Publications (2)

Publication Number Publication Date
CN111673749A (en) 2020-09-18
CN111673749B (en) 2021-06-08

Family

ID=72435614

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010515894.3A Active CN111673749B (en) 2020-06-09 2020-06-09 Adjusting method of visual welding robot and visual welding robot

Country Status (1)

Country Link
CN (1) CN111673749B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114289857B (en) * 2022-01-25 2023-07-07 江西理工大学 Method for autonomously correcting travelling path of stirring head of friction stir welding equipment
CN115042181B (en) * 2022-06-30 2023-04-11 中船黄埔文冲船舶有限公司 Multi-welding track generation method and system for intermediate assembly segmented robot

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5572102A (en) * 1995-02-28 1996-11-05 Budd Canada Inc. Method and apparatus for vision control of welding robots
JP6914067B2 (en) * 2017-03-21 2021-08-04 株式会社神戸製鋼所 Motion program correction method and welding robot system
US11179793B2 (en) * 2017-09-12 2021-11-23 Autodesk, Inc. Automated edge welding based on edge recognition using separate positioning and welding robots
US11065707B2 (en) * 2017-11-29 2021-07-20 Lincoln Global, Inc. Systems and methods supporting predictive and preventative maintenance
CN107876970B (en) * 2017-12-13 2020-01-10 浙江工业大学 Robot multilayer multi-pass welding seam three-dimensional detection and welding seam inflection point identification method
JP6904927B2 (en) * 2018-07-30 2021-07-21 ファナック株式会社 Robot system and calibration method
CN110539109B (en) * 2019-08-28 2024-04-09 广东工业大学 Robot automatic welding system and method based on single-binocular vision
CN110524580B (en) * 2019-09-16 2023-06-02 西安中科光电精密工程有限公司 Welding robot vision assembly and measuring method thereof
CN111055054B (en) * 2020-01-13 2021-11-16 北京博清科技有限公司 Welding seam identification method and device, welding robot and storage medium

Also Published As

Publication number Publication date
CN111673749A (en) 2020-09-18

Similar Documents

Publication Publication Date Title
CN111673749B (en) Adjusting method of visual welding robot and visual welding robot
JP3092809B2 (en) Inspection method and inspection apparatus having automatic creation function of inspection program data
CN110719696B (en) PCB solder-resisting windowing method and PCB laser windowing machine
CN110770989B (en) Unmanned and maintainable switchgear or control device system and method for operating same
US20070010969A1 (en) Pick and place machine with improved setup and operation procedure
CN105359640B (en) Mounting apparatus and mounting method
CN107271886B (en) Rapid alignment method of flying probe testing machine
US20200367396A1 (en) Inspection apparatus and component mounting system having the same
CN111386024B (en) Pin self-adaptive positioning insertion method and system for double-pin electronic component
CN104741739A (en) Position correcting system of welding robot
CN107571290B (en) Calibration device, method and system for industrial robot end effector
JP2017152651A (en) Component inspection device and component mounting device
CN107764210B (en) Pitch ear auricle assembly pin hole method for measuring coaxiality
CN104128709A (en) Automatic laser spot welding system and method based on vision-aided positioning
CN112238453B (en) Vision-guided robot arm correction method
KR101535801B1 (en) Process inspection device, method and system for assembling process in product manufacturing using depth map sensors
JPH11214899A (en) Method and apparatus for measuring positions of serial contact pins and positioning them on printed circuit board
CN110788439A (en) Manipulator soldering machine motion control system
US10875186B2 (en) Robot system
CN111993420A (en) Fixed binocular vision 3D guide piece feeding system
TWI537556B (en) Printed circuit board assembly detection system and the detection method thereof
KR101218572B1 (en) Automatic connecting apparatus for testing plane display device
CN205129209U (en) Automatically, seek mark welding system based on robot welding
CN114555271A (en) Correction system, correction method, robot system, and control device
KR20060077598A (en) A calibration equipment and method of laser vision system using 6-axis robot

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20230421

Address after: 518000 Shenzhen national engineering laboratory building b1001-b1004, No. 20, Gaoxin South seventh Road, high tech Zone community, Yuehai street, Nanshan District, Shenzhen, Guangdong

Patentee after: SHENZHEN CIMC SECURITY AND SMART TECHNOLOGY Co.,Ltd.

Address before: Room 102, Block A, Phase II, Science and Technology Building, 1057 Nanhai Avenue, Shekou, Nanshan District, Shenzhen City, Guangdong Province, 518000

Patentee before: SHENZHEN CIMC SECURITY AND SMART TECHNOLOGY Co.,Ltd.

Patentee before: CHINA INTERNATIONAL MARINE CONTAINERS (GROUP) Ltd.