CN116100204A - Positioning guide method based on coordinate mapping algorithm

Positioning guide method based on coordinate mapping algorithm

Info

Publication number
CN116100204A
Authority
CN
China
Prior art keywords
welding
coordinates
station
coordinate
robot
Prior art date
Legal status
Pending
Application number
CN202211739914.0A
Other languages
Chinese (zh)
Inventor
戴文壮
王振宇
Current Assignee
Wuxi Yuhui Information Technology Co., Ltd.
Original Assignee
Wuxi Yuhui Information Technology Co., Ltd.
Priority date
Filing date
Publication date
Application filed by Wuxi Yuhui Information Technology Co., Ltd.
Priority to CN202211739914.0A
Publication of CN116100204A

Classifications

    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B23 - MACHINE TOOLS; METAL-WORKING NOT OTHERWISE PROVIDED FOR
    • B23K - SOLDERING OR UNSOLDERING; WELDING; CLADDING OR PLATING BY SOLDERING OR WELDING; CUTTING BY APPLYING HEAT LOCALLY, e.g. FLAME CUTTING; WORKING BY LASER BEAM
    • B23K37/00 - Auxiliary devices or processes, not specially adapted to a procedure covered by only one of the preceding main groups
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/70 - Determining position or orientation of objects or cameras
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P - CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00 - Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/30 - Computing systems specially adapted for manufacturing

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Mechanical Engineering (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Numerical Control (AREA)

Abstract

The invention discloses a positioning and guiding method based on a coordinate mapping algorithm. The method involves a 3D scanning station, which scans the pole and Mark point images in steps and outputs the coordinates of the poles and Mark points, and a welding station, which acquires Mark point images and welds the CCS to the poles. The design is simple: gripping and re-placing the CCS at the welding station is avoided, and those process steps can be moved to other stations. The station cycle time is shortened, since the welding station only needs to capture the Mark points to obtain the pole coordinates for welding. Flexibility is enhanced: the pre-station is not required to be the immediately preceding station, and the arranged process flow can also serve other requirements of other stations.

Description

Positioning guide method based on coordinate mapping algorithm
Technical Field
The invention belongs to the technical field of welding of battery modules, and particularly relates to a positioning and guiding method based on a coordinate mapping algorithm.
Background
In the production of battery modules for new energy vehicles, battery pack welding is an indispensable step. In module welding, the CCS must be welded to the poles of the battery module, which requires a vision system to guide the welding system so that the welds are placed accurately.
During welding, the CCS above the welding station occludes the centers of the poles below it, so the vision system cannot directly acquire a complete image of the poles at the welding station. If the process were changed so that the CCS is removed at the welding station and re-aligned and placed back after the vision system finishes imaging, the mechanism design would become complex and the overall station cycle time would be too long.
To solve this problem, the vision system must acquire the image information of the poles and the Mark points in advance at a pre-station of the welding station, so as to obtain the overall relative positions of the poles; it then acquires the Mark point positions at the welding station and maps the pole coordinates to the welding station.
Disclosure of Invention
The invention aims to provide a positioning and guiding method based on a coordinate mapping algorithm, so as to solve the guiding and welding problem in the production of new energy vehicle batteries.
In order to achieve the above purpose, the technical scheme provided by the invention is as follows: the positioning and guiding method based on a coordinate mapping algorithm involves a 3D scanning station, which scans the pole and Mark point images in steps and outputs the coordinates of the poles and Mark points, and a welding station, which acquires Mark point images and welds the CCS to the poles.
The 3D scanning station acquires images by stepped 3D line scanning; a vision algorithm locates the images to obtain the position coordinates of the product pole centers and of the Mark points at the four corners, and the coordinate data are then stored in a corresponding local file indexed by the product SN so that they can be retrieved by the welding station.
In the welding station flow, a 2D camera is mounted on the Robot and the four Mark point coordinates are obtained in sequence by the vision algorithm; after film calibration, the Mark point coordinates in the film coordinate system are obtained.
Further, the file stored locally at station ST01 is retrieved to obtain the pole position coordinates and Mark point coordinates from the 3D scan.
Further, the 4 Mark point coordinates from the 3D scan and the 4 Mark point coordinates from the 2D camera are mapped to each other to obtain a mapping matrix, and the pole position coordinates from the 3D scan are converted into the film coordinate system through this matrix; the CCD and the welding galvanometer are calibrated at each welding position defined by the Robot; the welding origin coordinates of each Robot welding position are acquired, and finally the welding coordinates of each Robot welding position are obtained from the converted pole coordinates and the respective welding origin coordinates.
Further, the steps of the image acquisition process are as follows:
S1: a standard large calibration plate is selected; camera imaging calibration is carried out at each of the 4 Mark points; the calibration matrices A1, A2, A3 and A4 give the coordinates of each Mark point in the calibration plate coordinate system, denoted a1, a2, a3 and a4;
S2: the number of robot welding positions is N, with N greater than 4; four poles are to be welded by the galvanometer at each robot welding position, and all robot welding positions are determined;
S3: nine-point calibration between the galvanometer and the camera is carried out at the 1st robot welding position to obtain the conversion matrix between the galvanometer and the camera image coordinate system, denoted B1;
S4: low-power welding of the galvanometer origin is performed at the 1st to the Nth robot welding positions; the camera then takes photographs to obtain the N welding origins, and a calibration plate image is captured at each welding position to obtain N calibration matrices C1 to CN, where N is greater than 4;
S5: the coordinates of each welding origin in the calibration plate coordinate system are obtained through matrix B1 and matrices C1 to CN (see the sketch after this list);
S6: at a pre-station of the welding station, the position coordinates of all the poles and of the 4 Mark points are obtained through a single 3D line scan; the Mark point coordinates are b1, b2, b3 and b4;
S7: a mapping matrix S is obtained from the 4 Mark point coordinates a1, a2, a3 and a4 acquired at the welding station and the 4 Mark point coordinates b1, b2, b3 and b4 acquired at the pre-station;
S8: all pole coordinates obtained at the pre-station are converted into the calibration plate coordinate system through the mapping matrix S;
S9: the welding coordinates of each pole are obtained from the converted pole coordinates and the welding origin coordinates of each robot welding position;
S10: the welding coordinates are sent to the galvanometer equipment for welding.
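As an illustration of S1 to S5, the following Python sketch computes the welding origin of each robot welding position in the calibration plate coordinate system. It assumes that B1 (galvanometer coordinates to camera image coordinates, from the nine-point calibration of S3) and C1 to CN (camera image coordinates to calibration plate coordinates, from S4) are available as 3x3 homogeneous matrices; the example matrix values, the helper function and the variable names are assumptions for illustration, not the patented implementation.

import numpy as np

def apply_h(M, p):
    """Apply a 3x3 homogeneous transform M to a 2D point p = (x, y)."""
    q = M @ np.array([p[0], p[1], 1.0])
    return q[:2] / q[2]

# Placeholder calibration results; in practice these come from S3 and S4.
B1 = np.array([[0.05, 0.00, 640.0],   # galvanometer -> camera image (nine-point calibration)
               [0.00, 0.05, 512.0],
               [0.00, 0.00,   1.0]])
N = 5                                  # number of robot welding positions, N > 4
C = [np.eye(3) for _ in range(N)]      # camera image -> calibration plate, one matrix per position

galvo_origin = (0.0, 0.0)              # galvanometer origin marked by the low-power weld

# S5: welding origin of each robot welding position, expressed in the calibration plate frame.
weld_origins_plate = [apply_h(Ci, apply_h(B1, galvo_origin)) for Ci in C]
print(weld_origins_plate)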
Compared with the prior art, the invention has the following beneficial effects:
1. The design is simple: gripping and re-placing the CCS at the welding station is avoided, and those process steps can be moved to other stations.
2. The station cycle time is shortened: the welding station only needs to capture the Mark points to obtain the pole coordinates for welding.
3. Flexibility is enhanced: the pre-station is not required to be the immediately preceding station, and the arranged process flow can also serve other requirements of other stations.
Drawings
FIG. 1 is a schematic diagram of the system configuration of the present invention;
FIG. 2 is a schematic flow chart of an embodiment of the present invention;
Detailed Description
Embodiments of the present invention are described in detail below, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to like or similar elements or elements having like or similar functions throughout. The embodiments described below by referring to the drawings are illustrative and intended to explain the present invention and should not be construed as limiting the invention.
As shown in fig. 1-2, a positioning and guiding method based on a coordinate mapping algorithm involves a 3D scanning station, which scans the pole and Mark point images in steps and outputs the coordinates of the poles and Mark points, and a welding station, which acquires Mark point images and welds the CCS to the poles.
The 3D scanning station acquires images by stepped 3D line scanning; a vision algorithm locates the images to obtain the position coordinates of the product pole centers and of the Mark points at the four corners, and the coordinate data are then stored in a corresponding local file indexed by the product SN so that they can be retrieved by the welding station.
In the welding station flow, a 2D camera is mounted on the Robot and the four Mark point coordinates are obtained in sequence by the vision algorithm; after film calibration, the Mark point coordinates in the film coordinate system are obtained.
In this embodiment, the file stored locally at station ST01 is retrieved to obtain the pole position coordinates and Mark point coordinates from the 3D scan.
In this embodiment, the 4 Mark point coordinates from the 3D scan and the 4 Mark point coordinates from the 2D camera are mapped to each other to obtain a mapping matrix, and the pole position coordinates from the 3D scan are converted into the film coordinate system through this matrix; the CCD and the welding galvanometer are calibrated at each welding position defined by the Robot; the welding origin coordinates of each Robot welding position are acquired, and finally the welding coordinates of each Robot welding position are obtained from the converted pole coordinates and the respective welding origin coordinates.
At the welding station, the robot carries a laser galvanometer for welding and also carries a 2D area-array camera to acquire the Mark point images. The pre-station scans by 3D line scanning.
In this embodiment, the steps of the image acquisition process are as follows:
S1: a standard large calibration plate is selected; camera imaging calibration is carried out at each of the 4 Mark points; the calibration matrices A1, A2, A3 and A4 give the coordinates of each Mark point in the calibration plate coordinate system, denoted a1, a2, a3 and a4;
S2: the number of robot welding positions is N, with N greater than 4; four poles are to be welded by the galvanometer at each robot welding position, and all robot welding positions are determined;
S3: nine-point calibration between the galvanometer and the camera is carried out at the 1st robot welding position to obtain the conversion matrix between the galvanometer and the camera image coordinate system, denoted B1;
S4: low-power welding of the galvanometer origin is performed at the 1st to the Nth robot welding positions; the camera then takes photographs to obtain the N welding origins, and a calibration plate image is captured at each welding position to obtain N calibration matrices C1 to CN, where N is greater than 4;
S5: the coordinates of each welding origin in the calibration plate coordinate system are obtained through matrix B1 and matrices C1 to CN;
S6: at a pre-station of the welding station, the position coordinates of all the poles and of the 4 Mark points are obtained through a single 3D line scan; the Mark point coordinates are b1, b2, b3 and b4;
S7: a mapping matrix S is obtained from the 4 Mark point coordinates a1, a2, a3 and a4 acquired at the welding station and the 4 Mark point coordinates b1, b2, b3 and b4 acquired at the pre-station;
S8: all pole coordinates obtained at the pre-station are converted into the calibration plate coordinate system through the mapping matrix S;
S9: the welding coordinates of each pole are obtained from the converted pole coordinates and the welding origin coordinates of each robot welding position;
S10: the welding coordinates are sent to the galvanometer equipment for welding (see the sketch after this list).
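The following Python sketch shows one way to realize S6 to S10 at run time. It uses cv2.estimateAffinePartial2D, which fits a similarity transform (rotation, uniform scale and translation) and therefore covers the rigid mapping discussed below; the sample coordinates, the variable names and the reading of S9 as "pole position expressed relative to the welding origin of its position" are assumptions for illustration, not the patent's exact procedure.

import numpy as np
import cv2

# S6: pre-station results (4 Mark points b1..b4 and the pole centers), e.g. loaded by product SN.
marks_pre = np.array([[10.0, 10.0], [210.0, 10.0], [210.0, 110.0], [10.0, 110.0]],
                     dtype=np.float32)               # Mark points b1..b4 from the 3D line scan
poles_pre = np.array([[35.0, 30.0], [85.0, 30.0], [135.0, 30.0], [185.0, 30.0]],
                     dtype=np.float32)               # pole centers from the 3D line scan

# Mark points re-measured at the welding station, in the calibration plate frame (a1..a4).
marks_weld = np.array([[12.5, 11.0], [212.4, 12.1], [211.3, 112.0], [11.4, 110.9]],
                      dtype=np.float32)

# S7: mapping matrix S (2x3) from the pre-station Mark points to the welding-station ones.
S, _ = cv2.estimateAffinePartial2D(marks_pre, marks_weld)

# S8: convert all pole coordinates into the calibration plate coordinate system.
poles_plate = cv2.transform(poles_pre.reshape(-1, 1, 2), S).reshape(-1, 2)

# S9/S10: welding coordinates relative to the welding origin of the position, sent to the galvanometer.
weld_origin = np.array([30.0, 25.0])                 # from the calibration stage (S5)
weld_coords = poles_plate - weld_origin
print(weld_coords)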
In this example, the vision algorithm employed is coordinate mapping. Coordinate mapping of an image is one kind of geometric transformation of the image, and a geometric transformation is applied without changing the pixel values of the image.
Coordinate mapping of an image establishes a mapping relationship between the original image and the target image. There are two kinds of such mapping: one calculates, for any pixel of the original image, the coordinate position it is mapped to; the other calculates, for any pixel of the transformed image, the coordinate position in the original image it is mapped back from. The mapping from the original image to the target image is called forward mapping, while the mapping from the target image back to the original image is called reverse mapping.
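A small numeric example of the two directions, using numpy and an arbitrary illustrative matrix M (not a matrix taken from the method):

import numpy as np

M = np.array([[0.0, -1.0, 100.0],     # a 90-degree rotation plus a translation
              [1.0,  0.0,  20.0],
              [0.0,  0.0,   1.0]])

src = np.array([30.0, 40.0, 1.0])     # a pixel of the original image, in homogeneous coordinates

dst = M @ src                         # forward mapping: original image -> target image
back = np.linalg.inv(M) @ dst         # reverse mapping: target image -> original image
print(dst[:2], back[:2])              # back recovers the original point (30, 40)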
In this example, the coordinate mapping employs forward mapping.
The following are the image matrix transformation relationships, written in homogeneous coordinates.
Image translation matrix transformation relationship:
$$\begin{bmatrix} x' \\ y' \\ 1 \end{bmatrix} = \begin{bmatrix} 1 & 0 & t_x \\ 0 & 1 & t_y \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} x \\ y \\ 1 \end{bmatrix}$$
Image rotation matrix transformation relationship:
$$\begin{bmatrix} x' \\ y' \\ 1 \end{bmatrix} = \begin{bmatrix} \cos\theta & -\sin\theta & 0 \\ \sin\theta & \cos\theta & 0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} x \\ y \\ 1 \end{bmatrix}$$
Image scaling matrix transformation relationship:
$$\begin{bmatrix} x' \\ y' \\ 1 \end{bmatrix} = \begin{bmatrix} s_x & 0 & 0 \\ 0 & s_y & 0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} x \\ y \\ 1 \end{bmatrix}$$
Image shear (miscut) matrix transformation relationship:
$$\begin{bmatrix} x' \\ y' \\ 1 \end{bmatrix} = \begin{bmatrix} 1 & k_x & 0 \\ k_y & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} x \\ y \\ 1 \end{bmatrix}$$
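The following Python sketch builds the four transformations above as 3x3 homogeneous matrices and composes them; the parameter values are arbitrary examples, not values used by the method.

import numpy as np

def translation(tx, ty):
    return np.array([[1, 0, tx], [0, 1, ty], [0, 0, 1]], dtype=float)

def rotation(theta):
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]], dtype=float)

def scaling(sx, sy):
    return np.array([[sx, 0, 0], [0, sy, 0], [0, 0, 1]], dtype=float)

def shear(kx, ky):
    return np.array([[1, kx, 0], [ky, 1, 0], [0, 0, 1]], dtype=float)

# Transforms compose by matrix multiplication; the rightmost factor acts first.
M = translation(100, 20) @ rotation(np.pi / 6) @ scaling(1.2, 1.2) @ shear(0.1, 0.0)
p = M @ np.array([10.0, 5.0, 1.0])    # forward-map a single point
print(p[:2])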
In this example, since the 3D camera coordinate system and the film coordinate system correspond to the same real space and the workpiece is essentially not tilted during transfer, the mapping relationship can be regarded as a two-dimensional rigid body transformation.
The general form of a two-dimensional rigid body transformation matrix is as follows:
$$\begin{bmatrix} r_{xx} & r_{xy} & t_x \\ r_{yx} & r_{yy} & t_y \\ 0 & 0 & 1 \end{bmatrix}$$
This transformation leaves all angles and distances between the transformed coordinate positions unchanged. In addition, the matrix above has the characteristic that its upper-left 2x2 sub-matrix is orthogonal: taking each row (or each column) of that sub-matrix as a vector, the two row vectors (r_xx, r_xy) and (r_yx, r_yy) (or the two column vectors) form an orthogonal set of unit vectors. Each vector has unit length:
$$r_{xx}^2 + r_{xy}^2 = 1, \qquad r_{yx}^2 + r_{yy}^2 = 1$$
the two-dimensional rigid transformation matrix can be seen as a homogeneous matrix, and a homogeneous equation set is obtained.
Figure BDA0004032128020000083
Substituting the coordinates in the original coordinate system and the corresponding converted coordinates into this system yields the mapping matrix.
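One common way to carry out this substitution numerically is a centroid-plus-SVD (Kabsch) fit of the rigid transform to the point pairs. The sketch below assumes the pre-station Mark points b1 to b4 as the original coordinates and the welding-station Mark points a1 to a4 as the converted coordinates; the numeric values are placeholders, not measured data.

import numpy as np

b = np.array([[10.0, 10.0], [210.0, 10.0], [210.0, 110.0], [10.0, 110.0]])   # original coordinates
a = np.array([[12.5, 11.0], [212.4, 12.1], [211.3, 112.0], [11.4, 110.9]])   # converted coordinates

# Centre both point sets, then find the rotation that best aligns them in the least-squares sense.
cb, ca = b.mean(axis=0), a.mean(axis=0)
H = (b - cb).T @ (a - ca)
U, _, Vt = np.linalg.svd(H)
R = Vt.T @ U.T
if np.linalg.det(R) < 0:              # guard against a reflection solution
    Vt[1, :] *= -1
    R = Vt.T @ U.T
t = ca - R @ cb

# Assemble the 3x3 homogeneous mapping matrix described in the text.
S = np.eye(3)
S[:2, :2], S[:2, 2] = R, t
print(S)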
The above embodiments merely illustrate the principles of the present invention and its effectiveness, and are not intended to limit the invention. Those skilled in the art may modify or vary the above embodiments without departing from the spirit and scope of the invention. Accordingly, all equivalent modifications and variations made by persons of ordinary skill in the art without departing from the spirit and scope of the present disclosure are intended to be covered by the claims.

Claims (4)

1. A positioning and guiding method based on a coordinate mapping algorithm, characterized in that: the method involves a 3D scanning station, which scans the pole and Mark point images in steps and outputs the coordinates of the poles and Mark points, and a welding station, which acquires Mark point images and welds the CCS to the poles;
the 3D scanning station acquires images by stepped 3D line scanning; a vision algorithm locates the images to obtain the position coordinates of the product pole centers and of the Mark points at the four corners, and the coordinate data are then stored in a corresponding local file indexed by the product SN so that they can be retrieved by the welding station;
in the welding station flow, a 2D camera is mounted on the Robot and the four Mark point coordinates are obtained in sequence by the vision algorithm; after film calibration, the Mark point coordinates in the film coordinate system are obtained.
2. The positioning and guiding method based on a coordinate mapping algorithm as claimed in claim 1, wherein: the file stored locally at station ST01 is retrieved to obtain the pole position coordinates and Mark point coordinates from the 3D scan.
3. The positioning and guiding method based on a coordinate mapping algorithm as claimed in claim 1, wherein: the 4 Mark point coordinates from the 3D scan and the 4 Mark point coordinates from the 2D camera are mapped to each other to obtain a mapping matrix, and the pole position coordinates from the 3D scan are converted into the film coordinate system through this matrix; the CCD and the welding galvanometer are calibrated at each welding position defined by the Robot; the welding origin coordinates of each Robot welding position are acquired, and finally the welding coordinates of each Robot welding position are obtained from the converted pole coordinates and the respective welding origin coordinates.
4. The positioning and guiding method based on a coordinate mapping algorithm as claimed in claim 1, wherein the steps of the image acquisition and positioning process are as follows:
S1: a standard large calibration plate is selected; camera imaging calibration is carried out at each of the 4 Mark points; the calibration matrices A1, A2, A3 and A4 give the coordinates of each Mark point in the calibration plate coordinate system, denoted a1, a2, a3 and a4;
S2: the number of robot welding positions is N, with N greater than 4; four poles are to be welded by the galvanometer at each robot welding position, and all robot welding positions are determined;
S3: nine-point calibration between the galvanometer and the camera is carried out at the 1st robot welding position to obtain the conversion matrix between the galvanometer and the camera image coordinate system, denoted B1;
S4: low-power welding of the galvanometer origin is performed at the 1st to the Nth robot welding positions; the camera then takes photographs to obtain the N welding origins, and a calibration plate image is captured at each welding position to obtain N calibration matrices C1 to CN, where N is greater than 4;
S5: the coordinates of each welding origin in the calibration plate coordinate system are obtained through matrix B1 and matrices C1 to CN;
S6: at a pre-station of the welding station, the position coordinates of all the poles and of the 4 Mark points are obtained through a single 3D line scan; the Mark point coordinates are b1, b2, b3 and b4;
S7: a mapping matrix S is obtained from the 4 Mark point coordinates a1, a2, a3 and a4 acquired at the welding station and the 4 Mark point coordinates b1, b2, b3 and b4 acquired at the pre-station;
S8: all pole coordinates obtained at the pre-station are converted into the calibration plate coordinate system through the mapping matrix S;
S9: the welding coordinates of each pole are obtained from the converted pole coordinates and the welding origin coordinates of each robot welding position;
S10: the welding coordinates are sent to the galvanometer equipment for welding.
CN202211739914.0A 2022-12-30 2022-12-30 Positioning guide method based on coordinate mapping algorithm Pending CN116100204A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211739914.0A CN116100204A (en) 2022-12-30 2022-12-30 Positioning guide method based on coordinate mapping algorithm

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211739914.0A CN116100204A (en) 2022-12-30 2022-12-30 Positioning guide method based on coordinate mapping algorithm

Publications (1)

Publication Number Publication Date
CN116100204A (en) 2023-05-12

Family

ID=86263179

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211739914.0A Pending CN116100204A (en) 2022-12-30 2022-12-30 Positioning guide method based on coordinate mapping algorithm

Country Status (1)

Country Link
CN (1) CN116100204A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117020416A (en) * 2023-10-08 2023-11-10 宁德时代新能源科技股份有限公司 Coordinate conversion method and welding system
CN117020413A (en) * 2023-10-08 2023-11-10 宁德时代新能源科技股份有限公司 Polar column coordinate determination method, welding method and welding system
CN117020416B (en) * 2023-10-08 2024-02-06 宁德时代新能源科技股份有限公司 Coordinate conversion method and welding system
CN117020413B (en) * 2023-10-08 2024-02-23 宁德时代新能源科技股份有限公司 Polar column coordinate determination method, welding method and welding system

Legal Events

Date Code Title Description
PB01 Publication