CN114494424A - Welding guiding method and device based on vision

Welding guiding method and device based on vision

Info

Publication number
CN114494424A
CN114494424A (application number CN202210390934.5A)
Authority
CN
China
Prior art keywords
camera
coordinate system
dimensional
preset
matrix
Prior art date
Legal status
Pending
Application number
CN202210390934.5A
Other languages
Chinese (zh)
Inventor
谈源
史伟林
罗金
Current Assignee
Changzhou New Intelligent Technology Co Ltd
Original Assignee
Changzhou New Intelligent Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Changzhou New Intelligent Technology Co Ltd
Priority to CN202210390934.5A
Publication of CN114494424A
Status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B23 MACHINE TOOLS; METAL-WORKING NOT OTHERWISE PROVIDED FOR
    • B23K SOLDERING OR UNSOLDERING; WELDING; CLADDING OR PLATING BY SOLDERING OR WELDING; CUTTING BY APPLYING HEAT LOCALLY, e.g. FLAME CUTTING; WORKING BY LASER BEAM
    • B23K31/00 Processes relevant to this subclass, specially adapted for particular articles or purposes, but not covered by only one of the preceding main groups
    • B23K31/02 Processes relevant to this subclass, specially adapted for particular articles or purposes, but not covered by only one of the preceding main groups relating to soldering or welding
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B23 MACHINE TOOLS; METAL-WORKING NOT OTHERWISE PROVIDED FOR
    • B23K SOLDERING OR UNSOLDERING; WELDING; CLADDING OR PLATING BY SOLDERING OR WELDING; CUTTING BY APPLYING HEAT LOCALLY, e.g. FLAME CUTTING; WORKING BY LASER BEAM
    • B23K37/00 Auxiliary devices or processes, not specially adapted to a procedure covered by only one of the preceding main groups
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F17/00 Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G06F17/10 Complex mathematical operations
    • G06F17/16 Matrix or vector computation, e.g. matrix-matrix or matrix-vector multiplication, matrix factorization
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Pure & Applied Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computational Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Mathematical Analysis (AREA)
  • Mathematical Optimization (AREA)
  • Mechanical Engineering (AREA)
  • Optics & Photonics (AREA)
  • Algebra (AREA)
  • Computing Systems (AREA)
  • Databases & Information Systems (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Image Processing (AREA)

Abstract

The application relates to the technical field of welding processes, and discloses a welding guiding method and device based on vision. With the provided scheme, the coordinates of the camera can be obtained as it moves through three-dimensional space, and when the camera reaches a given position, the two-dimensional pixel coordinates of the target welding point are determined, so that welding guidance is completed accurately.

Description

Welding guiding method and device based on vision
Technical Field
The application relates to the technical field of welding processes, in particular to a welding guiding method and device based on vision.
Background
Welding, also known as fusion or melting, is a manufacturing process and technique for joining metals or other thermoplastic materials, such as plastics, by means of heat, high temperature or high pressure.
With social progress and scientific development, the requirements on the efficiency of welding processing are ever higher, and automated welding production is gradually replacing traditional manual production; a method for guiding welding is therefore needed.
Disclosure of Invention
The application discloses a welding guiding method and device based on vision, which are used for solving the technical problem that the prior art lacks a welding guiding method.
The application discloses in a first aspect a vision-based welding guidance method, comprising:
transforming the three-dimensional coordinates of the camera to a preset reference surface, and determining the corresponding relation between the reference surface and a two-dimensional virtual pixel coordinate system, wherein the Z coordinate on the reference surface is 0;
acquiring homogeneous coordinates of a camera when the camera shoots a welding image;
determining two-dimensional pixel coordinates of a target welding point in the welding image according to the corresponding relation between the reference surface and the two-dimensional virtual pixel coordinate system and the homogeneous coordinates of the camera;
and guiding welding according to the two-dimensional pixel coordinates of the target welding point.
Optionally, the reference surface carries 2 cm × 2 cm squares arranged in a 3 × 3 array.
Optionally, the determining a corresponding relationship between the reference plane and the two-dimensional virtual pixel coordinate system includes:
and determining the corresponding relation between the reference surface and the two-dimensional virtual pixel coordinate system according to a preset change coefficient, a preset camera internal reference matrix, a preset rotation matrix and a preset translation vector.
Optionally, the determining, according to a preset change coefficient, a preset camera internal reference matrix, a preset rotation matrix, and a preset translation vector, a corresponding relationship between the reference plane and the two-dimensional virtual pixel coordinate system includes:
determining a correspondence between the reference plane and a two-dimensional virtual pixel coordinate system by the following formula:
$$ s\,\tilde{m} = K\,[\,R \mid t\,]\,\tilde{M} $$

wherein $s$ represents the change coefficient, $\tilde{m}$ represents the homogeneous coordinates of the two-dimensional camera plane pixel, $K$ represents the camera internal reference matrix, $R$ represents the rotation matrix, $t$ represents the translation vector, and $\tilde{M}$ represents the homogeneous coordinates of the camera.
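This is the standard pinhole projection written in homogeneous coordinates. As a minimal numerical sketch (all values below are invented for illustration, and the use of NumPy is an assumption; the patent itself fixes none of this), the right-hand side can be evaluated and the pixel coordinates recovered by dividing out the change coefficient s:

```python
import numpy as np

# Assumed example values, chosen only for illustration.
K = np.array([[800.0,   0.0, 320.0],      # camera internal reference (intrinsic) matrix
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
R = np.eye(3)                             # rotation matrix (identity here)
t = np.array([[0.05], [0.02], [0.60]])    # translation vector, metres

M = np.array([[0.02], [0.02], [0.0], [1.0]])   # homogeneous point on the reference plane (Z = 0)

Rt = np.hstack([R, t])                    # 3x4 extrinsic matrix [R | t]
sm = K @ Rt @ M                           # s * m~, a 3x1 vector
s = sm[2, 0]                              # the change (scale) coefficient
m = sm / s                                # homogeneous pixel coordinates [u, v, 1]^T
print(s, m.ravel())
```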
Optionally, the camera internal reference matrix includes the camera's intrinsic parameters, which are determined according to the size of the camera chip and the resolution of the image captured by the camera.
In a second aspect of the present application, a vision-based welding guidance apparatus is disclosed, which is applied to the vision-based welding guidance method disclosed in the first aspect of the present application, and includes:
a coordinate system changing module, which is used for transforming the three-dimensional coordinates of the camera to a preset reference surface and determining the corresponding relation between the reference surface and a two-dimensional virtual pixel coordinate system, wherein the Z coordinate on the reference surface is 0;
the camera coordinate acquisition module is used for acquiring homogeneous coordinates of the camera when the camera shoots a welding image;
the welding point coordinate determination module is used for determining the two-dimensional pixel coordinate of the target welding point in the welding image according to the corresponding relation between the reference surface and the two-dimensional virtual pixel coordinate system and the homogeneous coordinate of the camera;
and the welding guide module is used for guiding welding according to the two-dimensional pixel coordinates of the target welding point.
Optionally, the reference surface carries 2 cm × 2 cm squares arranged in a 3 × 3 array.
Optionally, the coordinate system change module is configured to:
and determining the corresponding relation between the reference surface and the two-dimensional virtual pixel coordinate system according to a preset change coefficient, a preset camera internal reference matrix, a preset rotation matrix and a preset translation vector.
Optionally, the coordinate system change module is configured to determine a correspondence between the reference plane and the two-dimensional virtual pixel coordinate system according to the following formula:
$$ s\,\tilde{m} = K\,[\,R \mid t\,]\,\tilde{M} $$

wherein $s$ represents the change coefficient, $\tilde{m}$ represents the homogeneous coordinates of the two-dimensional camera plane pixel, $K$ represents the camera internal reference matrix, $R$ represents the rotation matrix, $t$ represents the translation vector, and $\tilde{M}$ represents the homogeneous coordinates of the camera.
Optionally, the camera internal reference matrix includes the camera's intrinsic parameters, which are determined according to the size of the camera chip and the resolution of the image captured by the camera.
The application relates to the technical field of welding processes, and discloses a welding guiding method and device based on vision. With the provided scheme, the coordinates of the camera can be obtained as it moves through three-dimensional space, and when the camera reaches a given position, the two-dimensional pixel coordinates of the target welding point are determined, so that welding guidance is completed accurately.
Drawings
In order to more clearly explain the technical solution of the present application, the drawings needed to be used in the embodiments will be briefly described below, and it is obvious to those skilled in the art that other drawings can be obtained according to the drawings without creative efforts.
Fig. 1 is a schematic workflow diagram of a vision-based welding guidance method according to an embodiment of the present disclosure;
fig. 2 is a schematic structural diagram of a vision-based welding guiding device disclosed in an embodiment of the present application.
Detailed Description
In order to solve the technical problem of lack of a welding guiding method in the prior art, the present application discloses a welding guiding method and device based on vision through the following embodiments.
The first embodiment of the present application discloses a vision-based welding guidance method, which, with reference to the workflow diagram shown in fig. 1, includes:
the principle is that a two-dimensional virtual pixel coordinate system is obtained according to a three-dimensional coordinate system of a camera, so that two-dimensional pixel coordinates of a target welding point are obtained, and welding is guided.
And step S1, transforming the three-dimensional coordinates of the camera to a preset reference plane, and determining the corresponding relation between the reference plane and a two-dimensional virtual pixel coordinate system, wherein the Z coordinate on the reference plane is 0.
In some embodiments of the present application, the reference plane carries 2 cm × 2 cm squares arranged in a 3 × 3 array.
In some embodiments of the present application, the determining a correspondence between the reference plane and a two-dimensional virtual pixel coordinate system includes:
and determining the corresponding relation between the reference surface and the two-dimensional virtual pixel coordinate system according to a preset change coefficient, a preset camera internal reference matrix, a preset rotation matrix and a preset translation vector.
Further, the determining a corresponding relationship between the reference plane and the two-dimensional virtual pixel coordinate system according to a preset change coefficient, a preset camera reference matrix, a preset rotation matrix, and a preset translation vector includes:
determining a correspondence between the reference plane and a two-dimensional virtual pixel coordinate system by the following formula:
$$ s\,\tilde{m} = K\,[\,R \mid t\,]\,\tilde{M} $$

wherein $s$ represents the change coefficient, $\tilde{m}$ represents the homogeneous coordinates of the two-dimensional camera plane pixel, $K$ represents the camera internal reference matrix, $R$ represents the rotation matrix, $t$ represents the translation vector, and $\tilde{M}$ represents the homogeneous coordinates of the camera.
Further, the camera internal reference matrix comprises the camera's intrinsic parameters, which are determined according to the size of the camera chip and the resolution of the image captured by the camera.
Step S2, the homogeneous coordinates of the camera are acquired while the camera is capturing the welding image.
And step S3, determining the two-dimensional pixel coordinates of the target welding point in the welding image according to the corresponding relation between the reference plane and the two-dimensional virtual pixel coordinate system and the homogeneous coordinates of the camera.
And step S4, guiding welding according to the two-dimensional pixel coordinates of the target welding point.
Specifically, when the camera shoots a welding image, the three-dimensional space coordinates of the camera and the two-dimensional plane coordinates of the corresponding camera plane pixel are obtained, and both are converted into homogeneous coordinates. The homogeneous coordinates of the camera are

$$ \tilde{M} = [\,X \;\; Y \;\; Z \;\; 1\,]^{T}, $$

and the homogeneous coordinates of the two-dimensional camera plane pixel are

$$ \tilde{m} = [\,u \;\; v \;\; 1\,]^{T}. $$

By adding this extra dimension, the projective transformation becomes a linear transformation, which makes the subsequent processing convenient.
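As a small illustration of this lifting step (a sketch only; the helper name and the use of NumPy are assumptions, not part of the patent text):

```python
import numpy as np

def to_homogeneous(point):
    """Append a 1 to a 2-D pixel coordinate (u, v) or a 3-D space coordinate (X, Y, Z)."""
    return np.append(np.asarray(point, dtype=float), 1.0)

print(to_homogeneous([320, 240]))          # pixel (u, v)    -> [u, v, 1]
print(to_homogeneous([0.02, 0.02, 0.0]))   # point (X, Y, Z) -> [X, Y, Z, 1]
```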
A reference plane $M$ is prepared, carrying small 2 cm × 2 cm squares arranged in a 3 × 3 array. Because any two planes in space are related by a rotation and a translation, the correspondence between the reference plane $M$ and the two-dimensional camera plane, i.e. the two-dimensional virtual pixel coordinate system, is

$$ s\,\tilde{m} = K\,[\,R \mid t\,]\,\tilde{M}, $$

where the camera internal reference matrix is

$$ K = \begin{bmatrix} f_x & 0 & u_0 \\ 0 & f_y & v_0 \\ 0 & 0 & 1 \end{bmatrix}, $$

in which $f_x$ and $f_y$ correspond to the length and width of the pixels on the camera, and $u_0$ and $v_0$ are the coordinates of the center point of the image captured by the camera.
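The description states that the intrinsic parameters follow from the chip size and the image resolution; one plausible way to assemble K under that reading is sketched below (the focal length and the sensor dimensions are illustrative assumptions, not values given in the patent):

```python
import numpy as np

def build_intrinsic_matrix(focal_mm, sensor_w_mm, sensor_h_mm, img_w_px, img_h_px):
    """Assemble a camera internal reference matrix K from chip size and image resolution."""
    pixel_w = sensor_w_mm / img_w_px          # physical width of one pixel (mm)
    pixel_h = sensor_h_mm / img_h_px          # physical height of one pixel (mm)
    fx = focal_mm / pixel_w                   # focal length expressed in pixel widths
    fy = focal_mm / pixel_h                   # focal length expressed in pixel heights
    u0 = img_w_px / 2.0                       # image center, from the resolution
    v0 = img_h_px / 2.0
    return np.array([[fx, 0.0, u0],
                     [0.0, fy, v0],
                     [0.0, 0.0, 1.0]])

# Illustrative values: an assumed 8 mm lens on an assumed 6.4 mm x 3.6 mm chip at 1920x1080.
K = build_intrinsic_matrix(focal_mm=8.0, sensor_w_mm=6.4, sensor_h_mm=3.6,
                           img_w_px=1920, img_h_px=1080)
print(K)
```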
Thus the rotation matrix can be written as

$$ R = [\,r_1 \;\; r_2 \;\; r_3\,], $$

where $r_1$, $r_2$ and $r_3$ represent the amounts of rotation about the three coordinate axes, and the translation vector

$$ t = [\,t_x \;\; t_y \;\; t_z\,]^{T} $$

represents the amounts of translation along the three coordinate axes.

Transforming the three-dimensional world coordinates to the reference plane through this series of transformations gives

$$ s\,\tilde{m} = K\,[\,r_1 \;\; r_2 \;\; r_3 \;\; t\,]\,[\,X \;\; Y \;\; Z \;\; 1\,]^{T}. $$

Since $M$ is the reference plane and $Z = 0$ on it, the term along the Z axis can be ignored, so the formula becomes

$$ s\,[\,u \;\; v \;\; 1\,]^{T} = K\,[\,r_1 \;\; r_2 \;\; t\,]\,[\,X \;\; Y \;\; 1\,]^{T}. $$

Let

$$ H = K\,[\,r_1 \;\; r_2 \;\; t\,] = [\,h_1 \;\; h_2 \;\; h_3\,], $$

so that

$$ s\,\tilde{m} = H\,[\,X \;\; Y \;\; 1\,]^{T}. $$

This is precisely the mapping that is required, so the matrix $H = [\,h_1 \;\; h_2 \;\; h_3\,]$, which is a 3 × 3 matrix, needs to be solved.
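Given the nine grid points on the reference plane and their detected pixel positions in the welding image, H can be estimated by a least-squares fit. The sketch below uses OpenCV's findHomography as one possible implementation; the point values are invented for illustration and the use of OpenCV is an assumption, not something the patent prescribes:

```python
import numpy as np
import cv2  # OpenCV is one possible tool here, not mandated by the patent

# Nine reference-plane points: a 3x3 grid of 2 cm squares, Z = 0, coordinates in metres.
grid = np.array([[x * 0.02, y * 0.02] for y in range(3) for x in range(3)], dtype=np.float32)

# Corresponding pixel positions detected in the welding image (illustrative values only).
pixels = np.array([[312 + 41 * x - 3 * y, 228 + 40 * y + 2 * x]
                   for y in range(3) for x in range(3)], dtype=np.float32)

H, _ = cv2.findHomography(grid, pixels)   # H maps [X, Y, 1]^T on the plane to s*[u, v, 1]^T
print(H)
```

Any standard direct linear transform solver would serve equally well; what matters for the method is only that the 3 × 3 mapping is obtained.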
Because $r_1$ and $r_2$, the rotation amounts about the x and y axes, are orthogonal, i.e.

$$ r_1^{T} r_2 = 0, $$

and because a rotation does not change length, i.e.

$$ \lVert r_1 \rVert = \lVert r_2 \rVert = 1, $$

it is possible to obtain

$$ h_1^{T} K^{-T} K^{-1} h_2 = 0, \qquad h_1^{T} K^{-T} K^{-1} h_1 = h_2^{T} K^{-T} K^{-1} h_2. $$

Then, from $H = K\,[\,r_1 \;\; r_2 \;\; t\,]$, after simplification the following can be obtained:

$$ r_1 = \lambda K^{-1} h_1, \qquad r_2 = \lambda K^{-1} h_2, \qquad t = \lambda K^{-1} h_3, \qquad \lambda = \frac{1}{\lVert K^{-1} h_1 \rVert}. $$

Because the camera's intrinsic parameters are available (the length and width of the pixels, $f_x$ and $f_y$, can be obtained from the size of the chip, and the coordinates of the center point of the captured image, $u_0$ and $v_0$, can be obtained from the resolution of the captured image), the matrix $K$ is also known, so $r_1$ and $r_2$ can be determined. Then, from

$$ r_3 = r_1 \times r_2, $$

$r_3$ is determined, and finally $t$ is determined. Substituting back into

$$ s\,\tilde{m} = K\,[\,R \mid t\,]\,\tilde{M}, $$

the transformation relation between the two coordinate systems is obtained, and the subsequent welding guidance can be carried out according to this transformation relation.
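A compact sketch of this closed-form recovery, and of mapping a candidate weld point whose position on the reference plane is assumed to be known, is given below (the function names are illustrative, and H and K are taken as already computed, for example as in the earlier sketches):

```python
import numpy as np

def recover_extrinsics(H, K):
    """Recover [r1 r2 r3] and t from a plane homography H, given the internal reference matrix K."""
    K_inv = np.linalg.inv(K)
    h1, h2, h3 = H[:, 0], H[:, 1], H[:, 2]
    lam = 1.0 / np.linalg.norm(K_inv @ h1)   # scale factor from ||r1|| = 1
    r1 = lam * (K_inv @ h1)
    r2 = lam * (K_inv @ h2)
    r3 = np.cross(r1, r2)                    # third column from orthogonality
    t = lam * (K_inv @ h3)
    return np.column_stack([r1, r2, r3]), t

def plane_point_to_pixel(H, X, Y):
    """Map a point (X, Y) on the reference plane to its two-dimensional pixel coordinates."""
    sm = H @ np.array([X, Y, 1.0])
    return sm[:2] / sm[2]

# Example usage, with H and K from the earlier sketches:
# R, t = recover_extrinsics(H, K)
# u, v = plane_point_to_pixel(H, 0.04, 0.02)   # assumed weld-point location on the plane
```

In practice the recovered R may be re-orthogonalised against noise, but that refinement is outside what the patent describes.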
The embodiment of the application discloses a vision-based welding guiding method: first, the three-dimensional coordinates of the camera are transformed to a reference plane and the corresponding relation between the reference plane and a two-dimensional virtual pixel coordinate system is determined; then the homogeneous coordinates of the camera are acquired when the camera shoots a welding image; the two-dimensional pixel coordinates of the target welding point in the welding image are determined according to that corresponding relation and the homogeneous coordinates of the camera; and finally welding is guided according to the two-dimensional pixel coordinates of the target welding point. With this scheme, the coordinates of the camera can be obtained as it moves through three-dimensional space, and when the camera reaches a given position, the two-dimensional pixel coordinates of the target welding point are determined, so that welding guidance is completed accurately.
It should be understood that, although the steps in the flowchart of fig. 1 are shown in the order indicated by the arrows, they are not necessarily performed in that order. Unless explicitly stated otherwise, the execution of these steps is not strictly limited to the order shown, and they may be performed in other orders. Moreover, at least some of the steps in fig. 1 may include multiple sub-steps or stages, which are not necessarily performed at the same time or in sequence, but may be performed in turn or alternately with other steps or with at least some of the sub-steps or stages of other steps.
The following are embodiments of the apparatus of the present application that may be used to perform embodiments of the method of the present application. For details which are not disclosed in the embodiments of the apparatus of the present application, reference is made to the embodiments of the method of the present application.
The second embodiment of the present application discloses a vision-based welding guiding device, which is applied to the vision-based welding guiding method disclosed in the first embodiment of the present application. Referring to the schematic structural diagram shown in fig. 2, the vision-based welding guiding device includes:
and the coordinate system changing module 10 is configured to transform the three-dimensional coordinate of the camera to a preset reference plane, and determine a corresponding relationship between the reference plane and a two-dimensional virtual pixel coordinate system, where a Z-axis of the reference plane is 0.
And a camera coordinate acquisition module 20 for acquiring homogeneous coordinates of the camera when the camera takes the welding image.
And the welding point coordinate determining module 30 is configured to determine a two-dimensional pixel coordinate of a target welding point in the welding image according to a correspondence between the reference plane and the two-dimensional virtual pixel coordinate system and the homogeneous coordinate of the camera.
And the welding guiding module 40 is used for guiding welding according to the two-dimensional pixel coordinates of the target welding point.
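Purely as an illustration of how the four modules could be composed in software, the following skeleton mirrors the structure of fig. 2; the class and method names are hypothetical and are not defined by the patent:

```python
import numpy as np

class CoordinateSystemChangingModule:
    """Transforms camera coordinates to the reference plane and builds the correspondence."""
    def determine_correspondence(self, K, R, t):
        # H = K [r1 r2 t]: the plane-to-pixel correspondence (Z = 0 on the reference plane).
        # t is expected as a length-3 vector.
        return K @ np.column_stack([R[:, 0], R[:, 1], t])

class CameraCoordinateAcquisitionModule:
    """Acquires the homogeneous coordinates of the camera when the welding image is taken."""
    def homogeneous_coordinates(self, X, Y, Z):
        return np.array([X, Y, Z, 1.0])

class WeldPointCoordinateModule:
    """Determines the two-dimensional pixel coordinates of the target welding point."""
    def pixel_of_weld_point(self, H, X, Y):
        sm = H @ np.array([X, Y, 1.0])
        return sm[:2] / sm[2]

class WeldGuidanceModule:
    """Guides welding according to the two-dimensional pixel coordinates."""
    def guide(self, u, v):
        print(f"Guide the torch towards pixel ({u:.1f}, {v:.1f})")
```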
Further, the reference surface carries 2 cm × 2 cm squares arranged in a 3 × 3 array.
Further, the coordinate system variation module 10 is configured to:
and determining the corresponding relation between the reference surface and the two-dimensional virtual pixel coordinate system according to a preset change coefficient, a preset camera internal reference matrix, a preset rotation matrix and a preset translation vector.
Further, the coordinate system changing module 10 is configured to determine a corresponding relationship between the reference plane and the two-dimensional virtual pixel coordinate system according to the following formula:
$$ s\,\tilde{m} = K\,[\,R \mid t\,]\,\tilde{M} $$

wherein $s$ represents the change coefficient, $\tilde{m}$ represents the homogeneous coordinates of the two-dimensional camera plane pixel, $K$ represents the camera internal reference matrix, $R$ represents the rotation matrix, $t$ represents the translation vector, and $\tilde{M}$ represents the homogeneous coordinates of the camera.
Further, the camera internal reference matrix comprises the camera's intrinsic parameters, which are determined according to the size of the camera chip and the resolution of the image captured by the camera.
The present application has been described in detail with reference to specific embodiments and illustrative examples, but the description is not intended to limit the application. Those skilled in the art will appreciate that various equivalent substitutions, modifications or improvements may be made to the presently disclosed embodiments and implementations thereof without departing from the spirit and scope of the present disclosure, and these fall within the scope of the present disclosure. The protection scope of this application is subject to the appended claims.

Claims (10)

1. A vision-based weld guidance method, comprising:
transforming the three-dimensional coordinates of the camera to a preset reference surface, and determining the corresponding relation between the reference surface and a two-dimensional virtual pixel coordinate system, wherein the Z coordinate on the reference surface is 0;
acquiring homogeneous coordinates of a camera when the camera shoots a welding image;
determining two-dimensional pixel coordinates of a target welding point in the welding image according to the corresponding relation between the reference surface and the two-dimensional virtual pixel coordinate system and the homogeneous coordinates of the camera;
and guiding welding according to the two-dimensional pixel coordinates of the target welding point.
2. The vision-based weld guidance method of claim 1, wherein the reference surface carries 2 cm × 2 cm squares arranged in a 3 × 3 array.
3. The vision-based weld guidance method of claim 1, wherein the determining a correspondence between the reference plane and a two-dimensional virtual pixel coordinate system comprises:
and determining the corresponding relation between the reference surface and the two-dimensional virtual pixel coordinate system according to a preset change coefficient, a preset camera internal reference matrix, a preset rotation matrix and a preset translation vector.
4. The vision-based weld guidance method of claim 3, wherein the determining the correspondence between the datum plane and the two-dimensional virtual pixel coordinate system according to a preset variation coefficient, a preset camera internal reference matrix, a preset rotation matrix and a preset translation vector comprises:
determining a correspondence between the reference plane and a two-dimensional virtual pixel coordinate system by the following formula:
$$ s\,\tilde{m} = K\,[\,R \mid t\,]\,\tilde{M} $$

wherein $s$ represents the change coefficient, $\tilde{m}$ represents the homogeneous coordinates of the two-dimensional camera plane pixel, $K$ represents the camera internal reference matrix, $R$ represents the rotation matrix, $t$ represents the translation vector, and $\tilde{M}$ represents the homogeneous coordinates of the camera.
5. The vision-based weld guidance method of claim 3, wherein the camera internal reference matrix includes camera intrinsic parameters determined from the size of the camera chip and the resolution of the image captured by the camera.
6. A vision-based weld guiding device applied to the vision-based weld guiding method of any one of claims 1 to 5, the vision-based weld guiding device comprising:
a coordinate system changing module, which is used for transforming the three-dimensional coordinates of the camera to a preset reference surface and determining the corresponding relation between the reference surface and a two-dimensional virtual pixel coordinate system, wherein the Z coordinate on the reference surface is 0;
the camera coordinate acquisition module is used for acquiring homogeneous coordinates of the camera when the camera shoots a welding image;
the welding point coordinate determination module is used for determining the two-dimensional pixel coordinate of the target welding point in the welding image according to the corresponding relation between the reference surface and the two-dimensional virtual pixel coordinate system and the homogeneous coordinate of the camera;
and the welding guide module is used for guiding welding according to the two-dimensional pixel coordinates of the target welding point.
7. The vision-based weld guide of claim 6, wherein the reference surface carries 2 cm × 2 cm squares arranged in a 3 × 3 array.
8. The vision-based weld guide of claim 6, wherein the coordinate system change module is to:
and determining the corresponding relation between the reference surface and the two-dimensional virtual pixel coordinate system according to a preset change coefficient, a preset camera internal reference matrix, a preset rotation matrix and a preset translation vector.
9. The vision-based weld guide of claim 8, wherein the coordinate system variation module is configured to determine the correspondence between the reference plane and a two-dimensional virtual pixel coordinate system by:
$$ s\,\tilde{m} = K\,[\,R \mid t\,]\,\tilde{M} $$

wherein $s$ represents the change coefficient, $\tilde{m}$ represents the homogeneous coordinates of the two-dimensional camera plane pixel, $K$ represents the camera internal reference matrix, $R$ represents the rotation matrix, $t$ represents the translation vector, and $\tilde{M}$ represents the homogeneous coordinates of the camera.
10. The vision-based weld guide of claim 8, wherein the camera internal reference matrix includes camera intrinsic parameters determined from the size of the camera chip and the resolution of the image captured by the camera.
CN202210390934.5A 2022-04-14 2022-04-14 Welding guiding method and device based on vision Pending CN114494424A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210390934.5A CN114494424A (en) 2022-04-14 2022-04-14 Welding guiding method and device based on vision

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210390934.5A CN114494424A (en) 2022-04-14 2022-04-14 Welding guiding method and device based on vision

Publications (1)

Publication Number Publication Date
CN114494424A (en) 2022-05-13

Family

ID=81488141

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210390934.5A Pending CN114494424A (en) 2022-04-14 2022-04-14 Welding guiding method and device based on vision

Country Status (1)

Country Link
CN (1) CN114494424A (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107589782A (en) * 2016-07-06 2018-01-16 可穿戴设备有限公司 Method and apparatus for the ability of posture control interface of wearable device
CN110033083A (en) * 2019-03-29 2019-07-19 腾讯科技(深圳)有限公司 Convolutional neural networks model compression method and apparatus, storage medium and electronic device

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107589782A (en) * 2016-07-06 2018-01-16 可穿戴设备有限公司 Method and apparatus for the ability of posture control interface of wearable device
CN110033083A (en) * 2019-03-29 2019-07-19 腾讯科技(深圳)有限公司 Convolutional neural networks model compression method and apparatus, storage medium and electronic device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
SOLDIER123333 (forum user): "Detailed explanation of the principle of the Zhang Zhengyou calibration algorithm", IT610 website *

Similar Documents

Publication Publication Date Title
KR101666959B1 (en) Image processing apparatus having a function for automatically correcting image acquired from the camera and method therefor
Scaramuzza et al. A flexible technique for accurate omnidirectional camera calibration and structure from motion
CN108648237B (en) Space positioning method based on vision
CN110809786B (en) Calibration device, calibration chart, chart pattern generation device, and calibration method
CN109064404A (en) It is a kind of based on polyphaser calibration panorama mosaic method, panoramic mosaic system
JP6223169B2 (en) Information processing apparatus, information processing method, and program
US20140118500A1 (en) System and method for finding correspondence between cameras in a three-dimensional vision system
CN109087244A (en) A kind of Panorama Mosaic method, intelligent terminal and storage medium
CN104424630A (en) Three-dimension reconstruction method and device, and mobile terminal
CN111801198A (en) Hand-eye calibration method, system and computer storage medium
CN107578450B (en) Method and system for calibrating assembly error of panoramic camera
US20120147149A1 (en) System and method for training a model in a plurality of non-perspective cameras and determining 3d pose of an object at runtime with the same
CN106709865A (en) Depth image synthetic method and device
CN113592721A (en) Photogrammetry method, apparatus, device and storage medium
US11694349B2 (en) Apparatus and a method for obtaining a registration error map representing a level of sharpness of an image
CN107442973B (en) Welding bead positioning method and device based on machine vision
CN114359406A (en) Calibration of auto-focusing binocular camera, 3D vision and depth point cloud calculation method
CN107507133B (en) Real-time image splicing method based on circular tube working robot
CN110490943A (en) Quick method for precisely marking, system and the storage medium of 4D holography capture system
Ding et al. Minimal solutions for panoramic stitching given gravity prior
CN114494424A (en) Welding guiding method and device based on vision
Kurz et al. Bundle adjustment for stereoscopic 3d
CN115457142B (en) Calibration method and system of MR hybrid photographic camera
CN107806861B (en) Inclined image relative orientation method based on essential matrix decomposition
CN111292380A (en) Image processing method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20220513