CN111830060A - White car body welding spot 3D calibration method, system and medium based on template matching - Google Patents


Info

Publication number
CN111830060A
CN111830060A (application CN202010678088.8A)
Authority
CN
China
Prior art keywords
welding spot
robot
calibration
welding
template matching
Prior art date
Legal status
Pending
Application number
CN202010678088.8A
Other languages
Chinese (zh)
Inventor
杨华
周江奇
郑宏良
何道聪
何智成
何仝
Current Assignee
SAIC GM Wuling Automobile Co Ltd
Original Assignee
SAIC GM Wuling Automobile Co Ltd
Priority date
Filing date
Publication date
Application filed by SAIC GM Wuling Automobile Co Ltd filed Critical SAIC GM Wuling Automobile Co Ltd
Priority to CN202010678088.8A priority Critical patent/CN111830060A/en
Publication of CN111830060A publication Critical patent/CN111830060A/en

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84 Systems specially adapted for particular applications
    • G01N21/88 Investigating the presence of flaws or contamination
    • G01N21/95 Investigating the presence of flaws or contamination characterised by the material or shape of the object to be examined
    • G01N21/9515 Objects of complex shape, e.g. examined with use of a surface follower device
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B11/002 Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00 Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84 Systems specially adapted for particular applications
    • G01N21/88 Investigating the presence of flaws or contamination
    • G01N21/8851 Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T7/85 Stereo camera calibration
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74 Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/75 Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
    • G06V10/751 Comparing pixel values or logical combinations thereof, or feature values having positional relevance, e.g. template matching

Abstract

The invention discloses a template-matching-based 3D calibration method, system and storage medium for body-in-white welding spots. The method comprises the following steps: segmenting welding spots with a welding spot detection and segmentation model, and converting the centers of the segmented welding spots into pixel coordinates; calibrating a camera with a calibration method and correcting the camera with the calibration result so that the camera plane is parallel to a robot coordinate plane; adjusting the robot so that the camera images clearly and the camera center approximately coincides with a welding spot center, recording the robot photographing position and the welding spot pixel position, and cropping the welding spot with a rectangular frame to serve as a template; moving the robot on the camera photographing plane with the photographing position as the origin, photographing, performing template matching, and calculating the welding spot calibration deviation; and transmitting the welding spot calibration deviation to the robot to calibrate the welding spot detected by the probe. The invention guarantees the positioning accuracy of welding spot quality detection with good robustness, and prevents the probe from excessively pressing the welding spot during detection and breaking the water film.

Description

White car body welding spot 3D calibration method, system and medium based on template matching
Technical Field
The invention relates to the technical field of vehicle body welding spot quality detection, in particular to a white vehicle body welding spot 3D calibration method and system based on template matching and a storage medium.
Background
At present, major vehicle manufacturers inspect the quality of body-in-white welding spots only by manual sampling. An ideal signal can be obtained only when a handheld detection probe (an ultrasonic probe or an eddy-current/flux-leakage probe) is aligned with the welding spot, and the main obstacle to automating welding spot detection is that the positioning accuracy of the detection probe cannot be guaranteed. The main shortcomings of manual detection and of existing positioning technologies for body-in-white welding spot detection are as follows:
1) The vehicle body has many complex curved surfaces, and existing two-dimensional positioning methods cannot meet the positioning requirements of the probe; although existing ultrasonic probes can sample ultrasonic signals, the acquired unsteady ultrasonic signals are easily disturbed by external interference;
2) Existing hand-eye calibration techniques usually calibrate with an optical checkerboard glass plate, but welding spots are distributed over the vehicle body in a complicated manner, often at the intersections of body panels, and some key welding spots even lie on strongly curved surfaces, so it is difficult to place a calibration plate for hand-eye calibration;
3) In existing binocular ranging, the measurement error of the vision system is low when the baseline B (the distance between the two cameras) is 0.8 to 2.2 times the object distance z, and high when B < 0.5z or B > 3z. Workshop production environments are cramped, so when a binocular camera is installed the baseline B is far smaller than z, the measurement error is large, and the ranging requirement is difficult to meet.
4) Traditional image processing techniques struggle to process images taken under light pollution and to identify welding spots in them.
When an ultrasonic flaw detector is used manually to inspect vehicle welding spots, the ultrasonic probe must be held against the welding spot position and kept perpendicular to the welding spot surface;
therefore, how to extract welding spots and effectively realize their positioning and detection has become a major problem for automatic detection.
Disclosure of Invention
The invention mainly aims to provide a template-matching-based 3D calibration method, system and storage medium for body-in-white welding spots, so as to solve the problems of plane calibration on a complex vehicle body, high-precision ranging in the depth direction, and image processing under light pollution, and to improve the accuracy and precision of welding spot positioning and detection.
In order to achieve the above object, the invention provides a white car body welding spot 3D calibration method based on template matching, comprising the following steps:
segmenting welding spots with a pre-trained welding spot detection and segmentation model, and converting the centers of the segmented welding spots into pixel coordinates so as to identify and position the welding spot centers;
calibrating a camera with a preset calibration method, and correcting the camera with the calibration result so that the camera plane is parallel to a robot coordinate plane;
adjusting the robot so that the camera images clearly and the camera center approximately coincides with a welding spot center, recording the robot photographing position and the welding spot pixel position, and cropping the welding spot with a rectangular frame to serve as a template;
moving the robot on the camera photographing plane with the photographing position as the origin, photographing, performing template matching, and calculating the welding spot calibration deviation based on the template matching result;
and transmitting the welding spot calibration deviation to the robot, and calibrating the welding spot detected by the probe.
Wherein the step of moving the robot on the camera photographing plane with the photographing position as the origin, photographing, performing template matching, and calculating the welding spot calibration deviation based on the template matching result comprises:
moving the robot on the camera photographing plane with the photographing position as the origin, photographing, performing template matching, and recording the robot position and the template matching result at that moment;
moving the robot to other positions and repeating the above operations; and calculating the conversion relation between the welding spot plane and the corresponding robot plane to obtain the welding spot calibration plane deviation.
Wherein the step of moving the robot on the camera photographing plane with the photographing position as the origin, photographing, performing template matching, and calculating the welding spot calibration deviation based on the template matching result further comprises:
moving the robot without changing the angular posture of the robot probe, guiding the laser onto the center of a vehicle body welding spot by means of the plane deviation so that the welding spot center lies within the range of the laser range finder, measuring the normal distance between the probe and the welding spot with the laser range finder, and analysing the robot coordinate axis corresponding to the laser ranging direction and the sign conversion relation between them, so as to obtain the welding spot calibration depth deviation.
Wherein the step of transmitting the welding spot calibration deviation to the robot and calibrating the welding spot detected by the probe comprises:
transmitting the welding spot calibration deviation to the robot, moving the robot without changing the angular posture of the robot probe so that the probe tip is aligned with the welding spot center, and recording the robot coordinates at that moment as the calibration position of the currently detected welding spot;
and combining the calibration position with the plane deviation and the depth deviation to guide the robot to move so that the probe reaches the corresponding welding spot on the vehicle body and then performs detection.
Wherein the preset calibration method is Zhang's calibration method.
Wherein, before the step of segmenting welding spots with the pre-trained welding spot detection and segmentation model and converting the centers of the segmented welding spots into pixel coordinates for identification and positioning of the welding spot centers, the method further comprises:
training the welding spot detection and segmentation model with a large amount of data and the deep-learning YOLOv3 framework.
Wherein, in addition to the step of calibrating the camera with the preset calibration method and correcting the camera with the calibration result so that the camera plane is parallel to the robot coordinate plane, the method further comprises:
roughly planning the robot detection trajectory, including the photographing position, the laser position and the welding spot detection position.
The invention also provides a template-matching-based body-in-white welding spot 3D calibration system, comprising a memory and a processor, wherein the memory stores a template-matching-based body-in-white welding spot 3D calibration program, and the steps of the method described above are implemented when the program is executed by the processor.
The invention also provides a computer readable storage medium, on which a template matching based body-in-white welding spot 3D calibration program is stored, which when executed by a processor implements the steps of the method as described above.
The invention has the following beneficial effects: the invention provides a 3D calibration method for the automatic detection of body-in-white welding spot quality. By combining template-matching-based robot movement calibration with laser ranging, 3D hand-eye calibration of welding spots on the complex curved surfaces of a body-in-white is realized, and the positioning accuracy of welding spot quality detection is guaranteed; the active hand-eye calibration method based on template matching and perspective transformation solves the problems that conventional calibration algorithms are difficult to apply and have low accuracy in a complex vehicle body environment. The welding spot positioning method combining robot teaching, laser ranging and two-dimensional visual positioning meets the high-precision requirement for 3D positioning of the probe in the automatic detection process. The deep-learning-based welding spot positioning technique can identify welding spots and their centers in images taken under different luminous fluxes, with good robustness. The improved calibration technique realizes board-free hand-eye calibration of target welding spots in various environments (planes, curved surfaces and corners); the three-dimensional calibration technique, built on two-dimensional plane calibration, prevents the probe from excessively pressing the welding spot during detection and breaking the water film.
Drawings
FIG. 1 is a schematic flow chart of a 3D calibration method of a welding spot of a body-in-white based on template matching according to the present invention;
FIG. 2 is a diagram illustrating the processing effect of the deep learning model according to the present invention;
FIG. 3 is a plan calibration interface diagram of the present invention;
FIG. 4 is a laser calibration interface diagram of the present invention;
FIG. 5 is a schematic illustration of a laser processing flow during operation of the present invention;
FIG. 6 is a diagram of laser ranging error analysis in accordance with the present invention.
In order to make the technical solution of the present invention clearer, it is described in detail below with reference to the accompanying drawings.
Detailed Description
It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
As shown in FIG. 1, the invention provides a white car body welding spot 3D calibration method based on template matching, comprising the following steps:
S1, segmenting welding spots with the pre-trained welding spot detection and segmentation model, and converting the centers of the segmented welding spots into pixel coordinates so as to identify and position the welding spot centers;
S2, calibrating the camera with a preset calibration method, and correcting the camera with the calibration result so that the camera plane is parallel to the robot coordinate plane;
S3, adjusting the robot so that the camera images clearly and its center approximately coincides with the welding spot center, recording the robot photographing position and the welding spot pixel position, and cropping the welding spot with a rectangular frame to serve as a template;
S4, moving the robot on the camera photographing plane with the photographing position as the origin, photographing, performing template matching, and calculating the welding spot calibration deviation based on the template matching result;
S5, transmitting the welding spot calibration deviation to the robot, and calibrating the welding spot detected by the probe.
Wherein the step of moving the robot on the camera photographing plane with the photographing position as the origin, photographing, performing template matching, and calculating the welding spot calibration deviation based on the template matching result comprises:
moving the robot on the camera photographing plane with the photographing position as the origin, photographing, performing template matching, and recording the robot position and the template matching result at that moment;
moving the robot to other positions and repeating the above operations; and calculating the conversion relation between the welding spot plane and the corresponding robot plane to obtain the welding spot calibration plane deviation.
Wherein the step of moving the robot on the camera photographing plane with the photographing position as the origin, photographing, performing template matching, and calculating the welding spot calibration deviation based on the template matching result further comprises:
moving the robot without changing the angular posture of the robot probe, guiding the laser onto the center of a vehicle body welding spot by means of the plane deviation so that the welding spot center lies within the range of the laser range finder, measuring the normal distance between the probe and the welding spot with the laser range finder, and analysing the robot coordinate axis corresponding to the laser ranging direction and the sign conversion relation between them, so as to obtain the welding spot calibration depth deviation.
Wherein the step of transmitting the welding spot calibration deviation to the robot and calibrating the welding spot detected by the probe comprises:
transmitting the welding spot calibration deviation to the robot, moving the robot without changing the angular posture of the robot probe so that the probe tip is aligned with the welding spot center, and recording the robot coordinates at that moment as the calibration position of the currently detected welding spot;
and combining the calibration position with the plane deviation and the depth deviation to guide the robot to move so that the probe reaches the corresponding welding spot on the vehicle body and then performs detection.
Wherein the preset calibration method is Zhang's calibration method.
Wherein, before the step of segmenting welding spots with the pre-trained welding spot detection and segmentation model and converting the centers of the segmented welding spots into pixel coordinates for identification and positioning of the welding spot centers, the method further comprises:
training the welding spot detection and segmentation model with a large amount of data and the deep-learning YOLOv3 framework.
Wherein, in addition to the step of calibrating the camera with the preset calibration method and correcting the camera with the calibration result so that the camera plane is parallel to the robot coordinate plane, the method further comprises:
roughly planning the robot detection trajectory, including the photographing position, the laser position and the welding spot detection position.
Compared with the prior art, the invention has the following beneficial effects:
1. the improved calibration technique realizes board-free hand-eye calibration of target welding spots in various environments (planes, curved surfaces and corners);
2. the three-dimensional calibration technique is realized on the basis of two-dimensional plane calibration, so that the probe does not excessively press the welding spot during detection and the water film is not broken;
3. the deep-learning-based welding spot positioning technique can identify welding spots and their centers in images taken under different luminous fluxes, with very good robustness.
The scheme of the invention is explained in detail below:
The invention provides a 3D calibration method for the automatic detection of body-in-white welding spot quality. By combining template-matching-based robot movement calibration with laser ranging, 3D hand-eye calibration of welding spots on the complex curved surfaces of a body-in-white is realized, and the positioning accuracy of welding spot quality detection is guaranteed. The overall calibration process is as follows:
Pre-processing: train the welding spot detection and segmentation model with a large amount of data and the deep-learning YOLOv3 framework, and convert the centers of the segmented welding spots into pixel coordinates so that the welding spot centers can subsequently be identified and positioned.
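To illustrate the center-extraction step, a minimal Python/OpenCV sketch is given below. It assumes the detection and segmentation model returns a binary mask for each detected welding spot; the function name and the use of image moments are illustrative choices, not part of the patented method.

```python
import cv2
import numpy as np


def weld_spot_center_pixels(mask: np.ndarray) -> tuple[float, float]:
    """Pixel coordinates of the centroid of a segmented welding spot.

    `mask` is assumed to be a single-channel binary image in which the
    pixels belonging to the welding spot are non-zero.
    """
    moments = cv2.moments(mask, binaryImage=True)
    if moments["m00"] == 0:
        raise ValueError("empty segmentation mask: no welding spot found")
    u = moments["m10"] / moments["m00"]  # column (x) pixel coordinate
    v = moments["m01"] / moments["m00"]  # row (y) pixel coordinate
    return u, v
```

If the model only outputs a bounding box rather than a mask, the box center can be used as the pixel coordinate instead.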
Camera calibration and correction: calibrate the camera with Zhang's calibration method, and correct the camera with the calibration result so that the camera plane is approximately parallel to a plane of the robot coordinate system (yoz, xoy or xoz, depending on the parallelism between the vehicle body and the corresponding robot plane).
Template generation and recording of the robot photographing position: adjust the robot so that the camera images clearly and its center approximately coincides with the welding spot center, record the photographing position on the robot side and the welding spot pixel position on the industrial personal computer side, and crop the welding spot with a rectangular frame to serve as the template.
Move the robot on the camera photographing plane with the photographing position as the origin, photograph, perform template matching, and record the robot position and the template matching result (the pixel coordinates of the top-left corner of the rectangular template in the new image); move the robot to other positions and repeat the above operations; then calculate the conversion relation between the welding spot plane and the corresponding robot plane.
During detection, a vehicle body welding spot deviates from its calibration-time position not only in the plane direction but also in the depth direction. If only the plane deviation were computed and the depth deviation ignored, the probe might be pressed against the welding spot hard enough to damage the probe, or be separated from the welding spot so that correct detection is impossible. A depth calibration therefore has to be added at calibration time: without changing the angular posture of the robot tool end (i.e. the probe), move the robot so that the laser is aligned with the welding spot center and the center lies within the range of the laser range finder, measure the normal distance between the probe and the welding spot with the laser range finder, and analyse the robot coordinate axis corresponding to the laser ranging direction and the sign conversion relation between them. At run time, the plane deviation obtained by image processing guides the laser onto the center of the corresponding welding spot of the new vehicle body; the distance is then measured again and compared with the calibration distance, and the resulting deviation is transmitted to the robot, so that the probe contacts the welding spot accurately when it is aligned with it and is not pressed against it excessively.
Calibrating the welding spot detected by the probe: without changing the angular posture of the robot tool end (i.e. the probe), move the robot so that the probe tip is aligned with the welding spot center, and record the robot coordinates at that moment as the calibration position of the currently detected welding spot. During operation, the calibration position is combined with the plane deviation and the depth deviation to guide the robot so that the probe reaches the corresponding welding spot on the new vehicle body and then performs detection.
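As an illustration of how the taught position and the two deviations might be combined at run time, the following sketch adds the in-plane deviation and the depth deviation to the calibration position. The axis assignment and the sign parameter are assumptions standing in for the axis mapping and positive/negative conversion relation determined during calibration; none of the names below come from the patent.

```python
import numpy as np


def weld_spot_target_position(calib_xyz_mm, plane_offset_mm, depth_offset_mm,
                              plane_axes=(0, 1), depth_axis=2, depth_sign=1.0):
    """Combine the taught position with run-time plane and depth deviations.

    calib_xyz_mm:    taught robot (x, y, z) for this welding spot.
    plane_offset_mm: (dx, dy) deviation from template matching, already mapped
                     into the robot coordinate plane.
    depth_offset_mm: run-time laser distance minus the calibration distance.
    depth_sign:      +1 or -1, the sign relation between the laser ranging
                     direction and the chosen robot axis, found at calibration.
    """
    target = np.array(calib_xyz_mm, dtype=float)
    target[plane_axes[0]] += plane_offset_mm[0]
    target[plane_axes[1]] += plane_offset_mm[1]
    target[depth_axis] += depth_sign * depth_offset_mm
    return target
```

For example, with a taught position of (850, 120, 430) mm, a plane deviation of (1.2, -0.8) mm and a depth deviation of 0.5 mm, the probe target becomes (851.2, 119.2, 430.5) mm.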
Compared with the prior art, the invention has the following beneficial effects:
1. the improved calibration technique realizes board-free hand-eye calibration of target welding spots in various environments (planes, curved surfaces and corners);
2. the three-dimensional calibration technique is realized on the basis of two-dimensional plane calibration, so that the probe does not excessively press the welding spot during detection and the water film is not broken;
3. the deep-learning-based welding spot positioning technique can identify welding spots and their centers in images taken under different luminous fluxes, with very good robustness.
The present invention will be described in further detail with reference to the accompanying drawings.
The invention discloses a high-precision 3D calibration system for the automatic detection of body-in-white welding spots, which comprises the following modules: a sensor module, a data processing module and an actuator module. Wherein:
the sensor module comprises a laser range finder, a two-dimensional vision guidance system (industrial camera, lens and light source) and a detection probe;
the data processing module comprises an industrial personal computer and a display;
the actuator module comprises an industrial robot control cabinet, the industrial robot body, a teach pendant and an end effector.
The sensor module is responsible for acquiring the depth and in-plane position information of the welding spot; the data processing module processes the information returned by the sensors, transmits it to the actuator and performs the final welding spot quality evaluation; the actuator module executes the planned trajectory, carries the sensor system, performs point positioning and carries the probe to detect the welding spots.
Correspondingly, the invention provides an operation process for the high-precision 3D calibration method. The operation steps are as follows:
Pre-processing: a deep learning method and a large number of welding spot pictures are used for training; welding spots are identified and their centers are positioned. The effect is shown in FIG. 2, which illustrates the processing result of the deep learning model.
1. Roughly plan the robot detection trajectory, mainly including the photographing position, the laser position and the welding spot detection position;
2. Camera calibration and correction: preferably, calibrate the camera with the MATLAB toolbox to obtain the intrinsic parameters; photograph at the detection position and extract at least 4 feature points; locate the corresponding feature points with the robot; use the camera calibration result to compute the rotation angles between the camera and the 3 coordinate axes of the robot coordinate system, i.e. the Euler angles (alpha, beta, gamma) of the camera in the robot coordinate system; then rotate the robot back by the corresponding angles, correcting the camera so that the camera plane is approximately parallel to a plane of the robot coordinate system (yoz, xoy or xoz, depending on the parallelism between the vehicle body and the corresponding robot plane).
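The Euler-angle estimation in step 2 can be sketched with OpenCV as follows. This is not the patented implementation (which preferably uses the MATLAB toolbox for the intrinsics); it is an illustrative example assuming at least four roughly coplanar feature points whose robot-frame coordinates and pixel positions are known, and assuming a Z-Y-X Euler convention.

```python
import cv2
import numpy as np


def camera_euler_angles(obj_pts_robot, img_pts, camera_matrix, dist_coeffs):
    """Estimate the camera's orientation relative to the robot coordinate system.

    obj_pts_robot: Nx3 array of feature points located with the robot (N >= 4,
                   roughly coplanar, e.g. points on the body panel).
    img_pts:       Nx2 array of the same points' pixel coordinates in the photo.
    camera_matrix, dist_coeffs: intrinsics from the camera calibration step.
    Returns Euler angles (alpha, beta, gamma) in degrees; rotating the robot
    back by these angles brings the camera plane approximately parallel to the
    chosen robot coordinate plane.
    """
    ok, rvec, _tvec = cv2.solvePnP(
        np.asarray(obj_pts_robot, dtype=np.float64),
        np.asarray(img_pts, dtype=np.float64),
        camera_matrix, dist_coeffs)
    if not ok:
        raise RuntimeError("solvePnP failed to find a camera pose")
    rot, _ = cv2.Rodrigues(rvec)
    # Decompose the rotation matrix into Z-Y-X Euler angles.
    beta = np.degrees(np.arctan2(-rot[2, 0], np.hypot(rot[0, 0], rot[1, 0])))
    alpha = np.degrees(np.arctan2(rot[1, 0], rot[0, 0]))
    gamma = np.degrees(np.arctan2(rot[2, 1], rot[2, 2]))
    return alpha, beta, gamma
```

Intrinsics exported from the MATLAB calibration can be passed directly as camera_matrix and dist_coeffs.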
3. Template generation and recording of the robot photographing position: after the approximate detection trajectory of the robot has been planned, adjust the robot at photographing position No. 1 so that the camera images clearly and its center approximately coincides with the welding spot center; record this photographing position on the robot side as the accurate photographing position, record the pixel position of the welding spot center on the industrial personal computer side, and crop the welding spot with a rectangular frame to serve as the template.
4. Move the robot by (+3, +3) with the photographing position as the origin on the camera photographing plane, photograph, perform template matching, and record the robot position and the template matching result (the pixel coordinates of the top-left corner of the rectangular template in the new image), as shown in FIG. 3, the plane calibration interface diagram.
5. Move the robot by (+3, -3), (-3, +3) and (-3, -3) with the photographing position as the origin, and repeat the operation of step 4 at each position.
6. From these data, compute and record the perspective transformation matrix; through this perspective transformation the offset of the welding spot in the image plane is accurately mapped to the corresponding world coordinate plane of the robot. In actual operation the computed result has to be negated, because the direction in which the camera image shifts is opposite to the direction in which the robot moves;
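Steps 4-6 can be sketched as follows: normalized cross-correlation template matching locates the template's top-left corner in each new image, and the four (pixel position, robot offset) pairs define the perspective transformation, with the sign inversion described above applied when the transform is built. The function names and the choice of cv2.TM_CCOEFF_NORMED are assumptions rather than requirements of the patent.

```python
import cv2
import numpy as np


def match_template_top_left(image_gray, template_gray):
    """Top-left pixel of the best template match in a new image."""
    scores = cv2.matchTemplate(image_gray, template_gray, cv2.TM_CCOEFF_NORMED)
    _min_val, _max_val, _min_loc, max_loc = cv2.minMaxLoc(scores)
    return max_loc  # (u, v) pixel coordinates


def plane_calibration_matrix(pixel_pts, robot_offsets_mm):
    """Perspective transform from image pixels to robot in-plane offsets.

    pixel_pts:        the 4 matched top-left positions recorded in steps 4-5.
    robot_offsets_mm: the corresponding robot moves, e.g. (+3, +3), (+3, -3),
                      (-3, +3), (-3, -3).
    The robot offsets are negated because the image shifts opposite to the
    robot's motion, so the returned matrix maps a pixel position directly to
    the in-plane correction the robot should apply.
    """
    src = np.asarray(pixel_pts, dtype=np.float32)
    dst = -np.asarray(robot_offsets_mm, dtype=np.float32)
    return cv2.getPerspectiveTransform(src, dst)


def pixel_to_robot_offset(h_matrix, pixel_pt):
    """Map one welding spot pixel position through the calibration matrix."""
    pt = np.array([[pixel_pt]], dtype=np.float32)         # shape (1, 1, 2)
    return cv2.perspectiveTransform(pt, h_matrix)[0, 0]   # (dx, dy) in mm
```

If more than four calibration positions are recorded, cv2.findHomography can be used in place of cv2.getPerspectiveTransform.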
7. Return to the photographing position and, without changing the angle, move the robot into the laser ranging range, align the laser with the welding spot, record the laser ranging depth, and record the robot coordinate axis corresponding to the laser ranging direction and the sign conversion relation between them;
as shown in fig. 4, 5 and 6, fig. 4 is a laser calibration interface diagram of the present invention; FIG. 5 is a schematic illustration of a laser processing flow during operation of the present invention; FIG. 6 is a diagram of laser ranging error analysis in accordance with the present invention.
The laser used in this embodiment of the invention has an accuracy of 0.02 mm. During calibration, the camera imaging plane can be made parallel to the vehicle body plane by visual adjustment and camera correction, and the laser and the camera are fixed on the same plane, so the angle between the laser emission plane and the vehicle body plane does not exceed 10 degrees. Let x denote the vertical distance from the laser emission point to the vehicle body plane, d0 the path length from the laser emitter to the vehicle body, d1 the path length of the light spot reflected from the vehicle body back to the laser receiving plane, and θ the angle between the laser emission plane and the vehicle body plane.
Theoretically:
d0·cosθ = x (1)
d1·cosθ = x + x·sinθ·tanθ + d1·sinθ·tanθ (2)
d = (d0 + d1)/2 (3)
d0 = d1·cos2θ (4)
The measuring distance in this project is usually 100-130 mm, and the read laser ranging deviation between two rangings is |Δx| = |d' - d| ≤ 2 mm;
the error in the depth direction is |(d' - d) - (x' - x)| + 0.02 ≤ 0.23 mm (0 ≤ θ ≤ 10°), which meets the accuracy requirement in the depth direction.
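The stated bound can be sanity-checked numerically from equations (1)-(3) as written. The short Python script below assumes θ = 10 degrees, a mid-range object distance of 115 mm and a read ranging deviation of 2 mm; the numbers are illustrative only and are not part of the patented procedure.

```python
import math

# Worst-case check of equations (1)-(3): theta = 10 degrees, object distance
# x = 115 mm (mid-range of 100-130 mm), read deviation d' - d of 2 mm.
theta = math.radians(10.0)


def path_average(x):
    """d = (d0 + d1) / 2 per equations (1)-(3) for vertical distance x (mm)."""
    d0 = x / math.cos(theta)                                   # eq. (1)
    d1 = x * (1 + math.sin(theta) * math.tan(theta)) / (
        math.cos(theta) - math.sin(theta) * math.tan(theta))   # eq. (2) solved for d1
    return 0.5 * (d0 + d1)                                     # eq. (3)


x_cal = 115.0
x_run = x_cal + 2.0 / path_average(1.0)   # d is linear in x, so d' - d = 2 mm
d_cal, d_run = path_average(x_cal), path_average(x_run)
depth_error = abs((d_run - d_cal) - (x_run - x_cal)) + 0.02    # + laser accuracy
print(f"d' - d = {d_run - d_cal:.3f} mm, depth error ≈ {depth_error:.3f} mm")
# Prints roughly 0.11 mm, within the 0.23 mm bound stated above.
```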
9. Move the probe close to the welding spot, observe the detected waveform, and record the robot position where the waveform is best as the probe calibration point.
10. Because high welding spot detection accuracy is required, calibration operations 3-9 are repeated for each welding spot to be detected.
The invention further provides a body-in-white welding spot 3D calibration system based on template matching, which comprises a memory and a processor, wherein the memory is stored with a body-in-white welding spot 3D calibration program based on template matching, and the steps of the method are realized when the body-in-white welding spot 3D calibration program based on template matching is operated by the processor, which is not described herein again.
The invention further provides a computer-readable storage medium, wherein the computer-readable storage medium stores a body-in-white welding spot 3D calibration program based on template matching, and the steps of the method are implemented when the body-in-white welding spot 3D calibration program based on template matching is executed by a processor, which are not described herein again.
The invention has the following beneficial effects: the invention provides a 3D calibration method for the automatic detection of body-in-white welding spot quality. By combining template-matching-based robot movement calibration with laser ranging, 3D hand-eye calibration of welding spots on the complex curved surfaces of a body-in-white is realized, and the positioning accuracy of welding spot quality detection is guaranteed; the active hand-eye calibration method based on template matching and perspective transformation solves the problems that conventional calibration algorithms are difficult to apply and have low accuracy in a complex vehicle body environment. The welding spot positioning method combining robot teaching, laser ranging and two-dimensional visual positioning meets the high-precision requirement for 3D positioning of the probe in the automatic detection process. The deep-learning-based welding spot positioning technique can identify welding spots and their centers in images taken under different luminous fluxes, with good robustness. The improved calibration technique realizes board-free hand-eye calibration of target welding spots in various environments (planes, curved surfaces and corners); the three-dimensional calibration technique, built on two-dimensional plane calibration, prevents the probe from excessively pressing the welding spot during detection and breaking the water film.
The above description is only a preferred embodiment of the present invention, and not intended to limit the scope of the present invention, and all modifications of equivalent structures and equivalent processes, which are made by using the contents of the present specification and the accompanying drawings, or directly or indirectly applied to other related technical fields, are included in the scope of the present invention.

Claims (9)

1. A white car body welding spot 3D calibration method based on template matching, characterized by comprising the following steps:
segmenting welding spots with a pre-trained welding spot detection and segmentation model, and converting the centers of the segmented welding spots into pixel coordinates so as to identify and position the welding spot centers;
calibrating a camera with a preset calibration method, and correcting the camera with the calibration result so that the camera plane is parallel to a robot coordinate plane;
adjusting the robot so that the camera images clearly and the camera center approximately coincides with a welding spot center, recording the robot photographing position and the welding spot pixel position, and cropping the welding spot with a rectangular frame to serve as a template;
moving the robot on the camera photographing plane with the photographing position as the origin, photographing, performing template matching, and calculating the welding spot calibration deviation based on the template matching result;
and transmitting the welding spot calibration deviation to the robot, and calibrating the welding spot detected by the probe.
2. The white car body welding spot 3D calibration method based on template matching as claimed in claim 1, wherein the step of moving the robot on the camera photographing plane with the photographing position as the origin, photographing, performing template matching, and calculating the welding spot calibration deviation based on the template matching result comprises:
moving the robot on the camera photographing plane with the photographing position as the origin, photographing, performing template matching, and recording the robot position and the template matching result at that moment;
moving the robot to other positions and repeating the above operations; and calculating the conversion relation between the welding spot plane and the corresponding robot plane to obtain the welding spot calibration plane deviation.
3. The white car body welding spot 3D calibration method based on template matching as claimed in claim 2, wherein the step of moving the robot on the camera photographing plane with the photographing position as the origin, photographing, performing template matching, and calculating the welding spot calibration deviation based on the template matching result further comprises:
moving the robot without changing the angular posture of the robot probe, guiding the laser onto the center of a vehicle body welding spot by means of the plane deviation so that the welding spot center lies within the range of the laser range finder, measuring the normal distance between the probe and the welding spot with the laser range finder, and analysing the robot coordinate axis corresponding to the laser ranging direction and the sign conversion relation between them, so as to obtain the welding spot calibration depth deviation.
4. The white car body welding spot 3D calibration method based on template matching as claimed in claim 3, wherein the step of transmitting the welding spot calibration deviation to the robot and calibrating the welding spot detected by the probe comprises:
transmitting the welding spot calibration deviation to the robot, moving the robot without changing the angular posture of the robot probe so that the probe tip is aligned with the welding spot center, and recording the robot coordinates at that moment as the calibration position of the currently detected welding spot;
and combining the calibration position with the plane deviation and the depth deviation to guide the robot to move so that the probe reaches the corresponding welding spot on the vehicle body and then performs detection.
5. The white car body welding spot 3D calibration method based on template matching, wherein the preset calibration method is Zhang's calibration method.
6. The white car body welding spot 3D calibration method based on template matching as claimed in claim 1, wherein, before the step of segmenting welding spots with the pre-trained welding spot detection and segmentation model and converting the centers of the segmented welding spots into pixel coordinates for identification and positioning of the welding spot centers, the method further comprises:
training the welding spot detection and segmentation model with a large amount of data and the deep-learning YOLOv3 framework.
7. The white car body welding spot 3D calibration method based on template matching as claimed in claim 1, wherein, in addition to the step of calibrating the camera with the preset calibration method and correcting the camera with the calibration result so that the camera plane is parallel to the robot coordinate plane, the method further comprises:
roughly planning the robot detection trajectory, including the photographing position, the laser position and the welding spot detection position.
8. A body-in-white welding spot 3D calibration system based on template matching is characterized by comprising a memory and a processor, wherein the memory is stored with a body-in-white welding spot 3D calibration program based on template matching, and the body-in-white welding spot 3D calibration program based on template matching is executed by the processor to realize the steps of the method according to any one of claims 1-7.
9. A computer readable storage medium having stored thereon a template matching based body-in-white weld 3D calibration program, which when executed by a processor implements the steps of the method according to any one of claims 1 to 7.
CN202010678088.8A 2020-07-15 2020-07-15 White car body welding spot 3D calibration method, system and medium based on template matching Pending CN111830060A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010678088.8A CN111830060A (en) 2020-07-15 2020-07-15 White car body welding spot 3D calibration method, system and medium based on template matching

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010678088.8A CN111830060A (en) 2020-07-15 2020-07-15 White car body welding spot 3D calibration method, system and medium based on template matching

Publications (1)

Publication Number Publication Date
CN111830060A true CN111830060A (en) 2020-10-27

Family

ID=72924006

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010678088.8A Pending CN111830060A (en) 2020-07-15 2020-07-15 White car body welding spot 3D calibration method, system and medium based on template matching

Country Status (1)

Country Link
CN (1) CN111830060A (en)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100114338A1 (en) * 2008-10-31 2010-05-06 Gm Global Technology Operations, Inc. Multi-goal path planning of welding robots with automatic sequencing
CN102590340A (en) * 2012-02-29 2012-07-18 湖南湖大艾盛汽车技术开发有限公司 Detection equipment for welding spot failure of whole set of white vehicle body
CN104942496A (en) * 2015-06-29 2015-09-30 湖南大学 Car body-in-white welding spot positioning method and device based on robot visual servo
WO2017184205A1 (en) * 2016-04-18 2017-10-26 Ghanem George K System and method for joining workpieces to form an article
CN109387569A (en) * 2017-08-11 2019-02-26 上汽通用五菱汽车股份有限公司 A kind of quality of welding spot automatic checkout system
CN107876970A (en) * 2017-12-13 2018-04-06 浙江工业大学 A kind of robot multi-pass welding welding seam three-dimensional values and weld seam inflection point identification method
CN108896658A (en) * 2018-05-14 2018-11-27 湖南湖大艾盛汽车技术开发有限公司 A kind of ultrasonic wave automated detection method based on PLC

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112750102A (en) * 2020-12-16 2021-05-04 华南理工大学 Welding spot positioning method and system based on image processing
CN113664323A (en) * 2021-07-23 2021-11-19 深圳市兆兴博拓科技股份有限公司 Automatic welding instrument control method, device, equipment and storage medium
CN117102725A (en) * 2023-10-25 2023-11-24 湖南大学 Welding method and system for steel-concrete combined structure connecting piece
CN117102725B (en) * 2023-10-25 2024-01-09 湖南大学 Welding method and system for steel-concrete combined structure connecting piece

Similar Documents

Publication Publication Date Title
CN111830060A (en) White car body welding spot 3D calibration method, system and medium based on template matching
CN109084681B (en) System and method for calibrating a vision system with respect to a contact probe
JP6280525B2 (en) System and method for runtime determination of camera miscalibration
JP6025386B2 (en) Image measuring apparatus, image measuring method, and image measuring program
US20080252248A1 (en) Device and Method for Calibrating the Center Point of a Tool Mounted on a Robot by Means of a Camera
CN103192386B (en) Image-vision-based automatic calibration method of clean robot
JP3930482B2 (en) 3D visual sensor
JP2005201824A (en) Measuring device
CN111531407B (en) Workpiece attitude rapid measurement method based on image processing
CN110560892B (en) Pipe identification method and device based on laser pipe cutting equipment
CN105547153A (en) Plug-in element visual positioning method and device based on binocular vision
CN112082477A (en) Universal tool microscope three-dimensional measuring device and method based on structured light
TW201538925A (en) Non-contact measurement device and method for object space information and the method thereof for computing the path from capturing the image
CN112505663A (en) Calibration method for multi-line laser radar and camera combined calibration
JPH0762869B2 (en) Position and shape measurement method by pattern projection
CN104034259A (en) Method for correcting image measurement instrument
CN112361958B (en) Line laser and mechanical arm calibration method
CN112958960A (en) Robot hand-eye calibration device based on optical target
CN112529856A (en) Method for determining the position of an operating object, robot and automation system
Jianming et al. Error correction for high-precision measurement of cylindrical objects diameter based on machine vision
CN110849285A (en) Welding spot depth measuring method, system and medium based on monocular camera
US20070271638A1 (en) Auto-teaching system
CN114266822A (en) Workpiece quality inspection method and device based on binocular robot, robot and medium
CN113160326A (en) Hand-eye calibration method and device based on reconstructed coordinate system
CN113733078B (en) Method for interpreting fine control quantity of mechanical arm and computer-readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination